Your code works fine with Scala 2.10.3 and scala-redis 2.11, even with string interpolation (tested in IntelliJ with SBT 0.13).
Try updating to the latest version of the client if you are on an older one. If you are using SBT:
libraryDependencies += "net.debasishg" % "redisclient_2.10" % "2.11"
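To illustrate, here is a minimal sketch of using an interpolated `Int` in a key. The key construction is plain Scala; the `RedisClient` calls are shown in comments because they need a running Redis server on localhost:6379 plus the dependency above (the key name `user:42:name` is just an example, not anything from your code):

```scala
object RedisKeyDemo extends App {
  val userId: Int = 42

  // String interpolation turns the Int into part of the key,
  // so there is no "integer key" problem at the client level:
  val key = s"user:$userId:name"
  println(key) // user:42:name

  // With scala-redis (net.debasishg), the same key is used directly:
  //   val r = new com.redis.RedisClient("localhost", 6379)
  //   r.set(key, "alice")
  //   r.get(key) // Some("alice")
}
```

Redis keys are binary-safe strings, so any `Int` you interpolate into an `s"..."` literal is converted via `toString` before it reaches the server.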