score:0
There is no clean solution to this in the current Slick version. You can work around the limit by packing some fields into a nested case class.
Please refer to this test case.
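A minimal sketch of the "pack some fields into one case class" idea. The table and column names here are hypothetical; the point is that each packed group gets its own sub-projection, so the outer projection tuple stays well under the 22-element limit:

```scala
import slick.jdbc.H2Profile.api._

// Hypothetical grouping: related columns are bundled into nested case classes
case class Name(first: String, last: String)
case class Address(street: String, city: String)
case class User(id: Long, name: Name, address: Address)

class Users(tag: Tag) extends Table[User](tag, "users") {
  def id     = column[Long]("id", O.PrimaryKey)
  def first  = column[String]("first")
  def last   = column[String]("last")
  def street = column[String]("street")
  def city   = column[String]("city")

  // Each group is mapped separately, then composed in the outer projection.
  def name    = (first, last) <> (Name.tupled, Name.unapply)
  def address = (street, city) <> (Address.tupled, Address.unapply)
  def *       = (id, name, address) <> (User.tupled, User.unapply)
}
```

With, say, five groups of ten columns each you can map fifty columns while no single tuple exceeds 22 elements.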
score:0
Actually this can be done via an HList, like this:

def * = (col1 :: col2 :: .. :: HNil).shaped <> (
  { case x => YYY(x(0), x(1), ..) },
  { x: YYY => Option(x.col1 :: x.col2 :: .. :: HNil) }
)
I've written a macro to do the mapping; you may take a look at it here: https://github.com/jilen/slickext
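The sketch above, filled in for a small hypothetical table (the real use case would have 22+ columns, but the pattern is identical). Note that the HList package name differs between Slick versions; this assumes Slick 3, where it lives under slick.collection.heterogeneous:

```scala
import slick.jdbc.H2Profile.api._
import slick.collection.heterogeneous._
import slick.collection.heterogeneous.syntax._

// Hypothetical 3-column table standing in for a 22+ column one
case class Person(id: Long, name: String, age: Int)

class People(tag: Tag) extends Table[Person](tag, "people") {
  def id   = column[Long]("id", O.PrimaryKey)
  def name = column[String]("name")
  def age  = column[Int]("age")

  // The HList projection has no 22-element limit, unlike tuples.
  // Pattern-matching on the HList avoids indexing by position.
  def * = (id :: name :: age :: HNil).shaped <> (
    { case id :: name :: age :: HNil => Person(id, name, age) },
    { p: Person => Option(p.id :: p.name :: p.age :: HNil) }
  )
}
```

Writing both conversion functions by hand gets tedious for wide rows, which is what the macro linked above is meant to automate.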
score:0
I have a HListCaseClassShape that works exactly like CaseClassShape but without the 22-column limit: here. You can then map it to your table like this: def * = MyHListCaseClassShape(f1, f2, f3, ...)
(see the Pair example here). Not sure whether it will work with Slick 2, but it might be worth a shot.
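For context, this is how the stock CaseClassShape from the Slick manual's Pair example is wired up; a HListCaseClassShape would be used the same way, just without the tuple-based 22-column ceiling. This assumes Slick 3's slick.jdbc API:

```scala
import slick.jdbc.H2Profile.api._

object Model {
  // Plain row type and its "lifted" mirror with Rep-wrapped fields
  case class Pair(a: Int, b: String)
  case class LiftedPair(a: Rep[Int], b: Rep[String])

  // Teaches Slick how to map between the two representations
  implicit object pairShape
    extends CaseClassShape(LiftedPair.tupled, Pair.tupled)
}

import Model._

class A(tag: Tag) extends Table[Pair](tag, "a") {
  def id = column[Int]("id", O.PrimaryKey)
  def s  = column[String]("s")
  // The implicit shape makes a case-class projection legal here
  def *  = LiftedPair(id, s)
}
```

Because CaseClassShape is itself built on tuples, it inherits the 22-field limit; that is exactly what an HList-backed variant is meant to remove.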
Source: stackoverflow.com