score:2
Accepted answer
It is not possible: only numeric (integer) columns can be used as `partitionColumn`. If your database supports some variant of rowid that is an integer, or can be cast to one, you can extract it in a subquery and partition on that (pseudocode):

(SELECT CAST(t.rowid AS integer) AS rid, t.* FROM my_table t) AS tmp
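The workaround above can be sketched in Spark as follows. This is a minimal sketch, not a definitive implementation: the table name `my_table`, the surrogate column `part_id`, and the PostgreSQL connection URL are all placeholder assumptions, and the `lowerBound`/`upperBound` values would have to match your data.

```scala
// Sketch: Spark's JDBC partitionColumn must be numeric, so we wrap the
// table in a subquery that exposes an integer surrogate key and partition
// on that. All names and connection details below are placeholders.
val subquery =
  "(SELECT CAST(t.id AS integer) AS part_id, t.* FROM my_table t) AS tmp"

// Options for the JDBC data source; the subquery is pushed down to the
// database as the "table" being read.
val jdbcOptions = Map(
  "url"             -> "jdbc:postgresql://localhost:5432/mydb",
  "dbtable"         -> subquery,
  "partitionColumn" -> "part_id", // must be an integer column
  "lowerBound"      -> "1",
  "upperBound"      -> "1000000",
  "numPartitions"   -> "8"
)

// With a SparkSession in scope, the read would then be:
//   val df = spark.read.format("jdbc").options(jdbcOptions).load()
```

Each of the `numPartitions` tasks then issues a query with a `WHERE part_id >= ... AND part_id < ...` range predicate, which is why the column has to be numeric.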
Source: stackoverflow.com
Related Queries
- Can we set String column as partitionColumn?
- How to set column names to toDF() function in spark dataframe using a string array?
- Anorm string set from postgres ltree column
- Scala Spark How can I convert a column array[string] to a string with JSON array in it?
- Scala SparkSQL Create UDF to handle exception when column can be sometime struct and sometime string
- How can I add sequence of string as column on dataFrame and make as transforms
- Can we set remove column names from s3 partition path and set path to values?
- How can I take a column in a dataframe that is a Map type and create a string that is just the key/value of the Map column
- How I can replace every first character of the string in the column in spark scala?
- How can I change column types in Spark SQL's DataFrame?
- Can I access my Scala app's name and version (as set in SBT) from code?
- How can I cast Integer to String in Scala?
- How to set a String in a Option[String]?
- How can I construct and parse a JSON string in Scala / Lift
- How can I define a custom equality operation that will be used by immutable Set comparison methods
- Spark dataframe get column value into a string variable
- Where can I set proxy for SBT in Intellij IDEA?
- Spark: Convert column of string to an array
- How can I handle a > 22 column table with Slick using nested tuples or HLists?
- How can I convert a json string to a scala map?
- How can I use the new Slick 2.0 HList to overcome 22 column limit?
- How to change the column type from String to Date in DataFrames?
- Spark column string replace when present in other column (row)
- How do I filter rows based on whether a column value is in a Set of Strings in a Spark DataFrame
- Extract words from a string column in spark dataframe
- Why can a method returning Unit be overridden with method returning String when return types are not explicitly given?
- Is it possible and how to have var that can only be set once?
- How to convert a string column with milliseconds to a timestamp with milliseconds in Spark 2.1 using Scala?
- Can I set a timeout and number of retries on a specific pipeline request?
More Queries from the same tag
- Why does Gatling still sends requests when scenario injection is on nothingFor?
- How to declare (not define) a spark udf
- Kinds not conforming with type lambda
- Apache Spark - MLlib - K-Means Input format
- How are Scala collections able to return the correct collection type from a map operation?
- SQL JOIN to find mismatches
- Akka remote routees in scalable cluster
- Scala Trait Mixin order
- PlayFramework HTML, variable into Javascript?
- Play! Framework: Limit concurrent non-blocking IO calls made via WS
- How to avoid double logging with logback?
- How to add the schema in a dataframe from a config file
- Why does fold left expect (a -> b -> a) instead of (b -> a -> a)?
- Scala nested map filter
- Spark Parquet partitioning: Can I partition by a value of a given Map element?
- Adding new column using existing one using Spark Scala
- Read a dataframe from csv/json/parquet depending on the argument given in spark
- JsError while trying to parse JSON file into case class with Play framework API in Scala
- Using = in Unit methods
- What is the difference between these two expressions in a Scala anonymous function?
- scala how to convert future of one type to future of another type
- Difference in asymptotic time of two variants of flatten
- NodeSeq match fails, but equivalent Elem match succeeds -- why? how to fix?
- Filtering JSON with apache spark (NO SPARK SQL) - Scala
- Generic recursive type in Scala
- Scala Typeclasses with generics
- Is eq always called in scala ==?
- Spark: Single pipelined scala command better than separate commands?
- Spark (streaming) RDD foreachPartitionAsync functionality/working
- Value of a scala variable do not change while executing play Future query. Why?