score:9
Actually, a session is an immutable structure as well. It's true that the session object stored in the request has a + method, but this method respects the immutability paradigm by returning a new instance of Session, keeping request.session unchanged.
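To see what this means in practice, here is a minimal sketch of the idea. The Session class below is a simplified stand-in mimicking Play's actual Session (a wrapper around an immutable Map), not the real Play class:

```scala
// Simplified stand-in for Play's Session: a wrapper over an immutable Map.
// The + method returns a fresh copy instead of mutating in place.
case class Session(data: Map[String, String] = Map.empty) {
  def +(kv: (String, String)): Session = copy(data = data + kv)
  def get(key: String): Option[String] = data.get(key)
}

val original = Session(Map("user" -> "alice"))
val updated  = original + ("token" -> "foobar")

// The original session is untouched; only the new instance has the field.
println(original.get("token")) // prints None
println(updated.get("token"))  // prints Some(foobar)
```

Play's real Session behaves the same way: calling + on request.session gives you a new value and never modifies the one attached to the request.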
Thinking a step further, we can assert that an updated session only makes sense when reused in another request-response transaction...
So the way to update a session is to do it while building the response (a Result in Play), like this:

Ok("just a test").withSession(request.session + ("token" -> "foobar"))
This will add the new session field to your cookie, and it will be available in the next transaction (i.e. request-response).
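To make the full round trip concrete, here is a minimal sketch of that flow, using a plain immutable Map as a stand-in for Play's session cookie (the val names are illustrative, not Play API):

```scala
// Plain immutable Map standing in for the session cookie's contents.
type SessionData = Map[String, String]

// Transaction 1: the incoming request carries an empty session.
val requestSession: SessionData = Map.empty

// Building the response: add the field, as withSession(request.session + ...) would.
val responseSession: SessionData = requestSession + ("token" -> "foobar")

// The session cookie sent with the response becomes the session
// of the next incoming request (transaction 2).
val nextRequestSession: SessionData = responseSession

println(nextRequestSession.get("token")) // prints Some(foobar)
```

The key point is that the updated session never appears in the current request; it only travels forward, via the response cookie, into the next request.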
Source: stackoverflow.com