score:1
We finally found a workaround, as follows.

According to the column definition in Slick,

def column[C](n: String, options: ColumnOption[C]*)(implicit tt: TypedType[C]): Rep[C]

you can specify how a column is translated between the driver and your code. The out-of-the-box translations are fine for most cases, but for Oracle the translation for the CLOB type doesn't seem to work properly.

What we did was define the column as a String, but let Slick handle the translation with our custom code. The column definition is the following:
def myClobColumn = column[String]("CLOBCOLUMN")(new StringJdbcType)
Here StringJdbcType is our custom code that handles the translation between the String to be inserted (up to 65535 bytes) and an Oracle CLOB.

The code for StringJdbcType is as follows:
import java.sql.{PreparedStatement, ResultSet}
import scala.io.Codec

class StringJdbcType extends driver.DriverJdbcType[String] {
  def sqlType = java.sql.Types.VARCHAR

  // Here's the solution: build the CLOB from the connection behind the PreparedStatement
  def setValue(v: String, p: PreparedStatement, idx: Int) = {
    val conn = p.getConnection
    val clob = conn.createClob()
    clob.setString(1, v)
    p.setClob(idx, clob)
  }

  // Read the CLOB column back through its ASCII stream, decoding it as ISO-8859-1
  def getValue(r: ResultSet, idx: Int) =
    scala.io.Source.fromInputStream(r.getAsciiStream(idx))(Codec.ISO8859).getLines().mkString

  def updateValue(v: String, r: ResultSet, idx: Int) = r.updateString(idx, v)

  override def hasLiteralForm = false
}
The setValue function was our salvation, because it let us build an Oracle CLOB from the already instantiated PreparedStatement and the String coming from our domain; our implementation only had to do the plumbing and dirty work for the Oracle CLOB.

In sum, the extension point offered by Slick in driver.DriverJdbcType[A] was what actually made this work.
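To show where this plugs in, here is a minimal sketch of a table definition and an insert using the column from above. The Documents table, the Document case class, and the database configuration key are hypothetical, and the import assumes Slick 3.2+, where slick.jdbc.OracleProfile ships with Slick itself; only the column[String]("CLOBCOLUMN")(new StringJdbcType) call is taken from our solution, with StringJdbcType assumed to be in scope and defined against the same profile.

import slick.jdbc.OracleProfile.api._

case class Document(id: Long, body: String)

class Documents(tag: Tag) extends Table[Document](tag, "DOCUMENTS") {
  def id = column[Long]("ID", O.PrimaryKey)
  // Explicitly passing the custom JdbcType bypasses the default CLOB mapping
  def body = column[String]("CLOBCOLUMN")(new StringJdbcType)
  def * = (id, body) <> (Document.tupled, Document.unapply)
}

val documents = TableQuery[Documents]

// With a configured database, the insert then looks like any other String insert:
// val db = Database.forConfig("oracle")             // hypothetical config key
// db.run(documents += Document(1L, someLongString))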
score:0
These are some improvements related to the solution above: closing resources and inspecting the stream before reading it.
import java.sql.{PreparedStatement, ResultSet}

class BigStringJdbcType extends profile.DriverJdbcType[String] {

  def sqlType: Int = java.sql.Types.VARCHAR

  def setValue(v: String, p: PreparedStatement, idx: Int): Unit = {
    val connection = p.getConnection
    val clob = connection.createClob()
    try {
      clob.setString(1, v)
      p.setClob(idx, clob)
    } finally {
      clob.free()
    }
  }

  def getValue(r: ResultSet, idx: Int): String = {
    val asciiStream = r.getAsciiStream(idx)
    try {
      // Only decode the stream when it actually has data available
      val (hasData, encoding) = getInputStreamStatus(asciiStream)
      if (hasData) {
        convertInputStreamToString(asciiStream, encoding)
      } else ""
    } finally {
      asciiStream.close()
    }
  }

  def updateValue(v: String, r: ResultSet, idx: Int): Unit =
    r.updateString(idx, v)

  override def hasLiteralForm: Boolean = false
}
And some utilities to complement the solution:
import java.io.{InputStream, InputStreamReader}
import scala.io.Codec

// Peek at the stream without consuming it: ready() is true when data is available.
// The reader is deliberately not closed here, because closing it would also close
// the underlying stream before the caller has read it; the caller owns the stream.
def getInputStreamStatus(stream: InputStream): (Boolean, String) = {
  val reader = new InputStreamReader(stream)
  val hasData = reader.ready()
  val encoding = reader.getEncoding
  hasData -> encoding
}

def convertInputStreamToString(
  stream: InputStream,
  encoding: String
): String = {
  scala.io.Source.fromInputStream(stream)(Codec(encoding)).getLines().mkString
}
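As in the first answer, the improved type is passed explicitly where the CLOB-backed column is declared; the column name below is a placeholder:

// Hypothetical column declaration inside a Table[...] definition
def body: Rep[String] = column[String]("CLOBCOLUMN")(new BigStringJdbcType)

Passing the JdbcType explicitly, rather than putting it in implicit scope, keeps the custom CLOB handling limited to the columns that actually need it, while every other String column keeps the profile's default mapping.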