score:2
A UDF cannot return `Row` objects. The return type has to be one of the types enumerated in the "Value type in Scala" column of the Spark SQL data types table.
The good news is that there should be no need for a UDF here. If `object1` and `object2` have the same schema (it wouldn't work otherwise anyway), you can use the `array` function:

import org.apache.spark.sql.functions._

df.select(array(col("object1"), col("object2")))
or, if `object1` and `object2` are not top-level columns:

df.select(array(col("path.to.object1"), col("path.to.object2")))
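A minimal end-to-end sketch of the `array` approach (the DataFrame, column names, and struct fields here are illustrative, not from the question):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical session and data; the tuple fields stand in for
// whatever schema object1 and object2 actually share.
val spark = SparkSession.builder.appName("array-example").getOrCreate()
import spark.implicits._

val df = Seq(((1, "a"), (2, "b"))).toDF("object1", "object2")

// array() combines the two struct columns into one array<struct<...>>
// column -- no UDF needed, and the schema is preserved.
df.select(array(col("object1"), col("object2")).alias("objects"))
  .printSchema()
```

This only works because both structs have identical schemas; `array` requires all its arguments to share one element type.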
score:0
I would like to suggest an alternative approach that can be used when the schemas of object1 and object2 are different and you still need to return a row. Basically, to return a row you simply return a case class matching the schema of the row objects — in this case object1 and object2, which themselves appear to be rows.
So do the following:

case class Object1(<add the schema here>)
case class Object2(<add the schema here>)
case class Record(object1: Object1, object2: Object2)
Now inside the UDF you can build an `Object1` and an `Object2` from firstobject and secondobject, then:

val record = Record(object1, object2)

and return `record`. This way you can return rows even when the schemas are not the same or some processing is required.
I know this doesn't directly pertain to your question, but it seemed like a good opportunity to mention this concept.
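The steps above can be sketched as follows. This is a hedged example: the case-class fields and the input column names are assumptions standing in for the real schemas of object1 and object2.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

// Illustrative schemas -- substitute your actual fields.
case class Object1(id: Int, name: String)
case class Object2(label: String)
case class Record(object1: Object1, object2: Object2)

val spark = SparkSession.builder.appName("udf-record").getOrCreate()
import spark.implicits._

// The UDF constructs both case classes and wraps them in Record;
// Spark derives the nested struct schema from the case class.
val combine = udf { (id: Int, name: String, label: String) =>
  Record(Object1(id, name), Object2(label))
}

val df = Seq((1, "a", "x")).toDF("id", "name", "label")
df.select(combine(col("id"), col("name"), col("label")).alias("record"))
  .printSchema()
```

Because `Record` is a `Product` type, Spark can encode it as a struct column even though its two halves have different schemas, which is exactly the case the `array` trick cannot handle.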
Source: stackoverflow.com