What's the problem with simply adding `UserIdentity` as an argument to your functions? Knowing who the user is seems to be important for your business logic: today you want to log who performed the operation, and tomorrow you will want to make sure this particular user is allowed to do it.

And I would just use a real `UserIdentity` object, not some hack with `WrappedRequest` — your services don't need to mess with an HTTP-layer request wrapper.
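A minimal sketch of the idea: the service takes the domain-level `UserIdentity` as an ordinary parameter, so it has no dependency on `HttpContext` or `WrappedRequest`. The names `UserIdentity` and `AccountService` here are illustrative, not taken from any particular framework.

```scala
// Hypothetical sketch: pass the user's identity explicitly instead of
// reaching into a global HttpContext or a request wrapper.
case class UserIdentity(id: Long, name: String)

class AccountService {
  // The service depends only on the domain-level UserIdentity,
  // not on any HTTP-layer type.
  def closeAccount(accountId: Long, user: UserIdentity): String = {
    // Today this logs who performed the operation; tomorrow this is
    // also where an authorization check for `user` would go.
    s"Account $accountId closed by user ${user.name} (id=${user.id})"
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    val service = new AccountService
    println(service.closeAccount(42L, UserIdentity(7L, "alice")))
  }
}
```

The controller (or whatever sits at the HTTP boundary) is the one place that extracts the identity from the request and hands it down; everything below that line stays testable without a running application.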