score:2
Accepted answer
The default decimal precision is DecimalType(38, 18).
I am not sure what you are trying to do, but you can cast the current decimal type as:
df1.withColumn("map",udfJsonStrToMapDecimal($"json").cast("map<string, decimal(38,6)>"))
Schema:
root
|-- json: string (nullable = true)
|-- map: map (nullable = true)
| |-- key: string
| |-- value: decimal(38,6) (valueContainsNull = true)
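The cast matters because, under the default DecimalType(38, 18), values carry 18 fractional digits. A minimal plain-Scala sketch of the same rescaling (no Spark required; the object name and sample value are illustrative, not from the answer):

```scala
object DecimalScaleSketch {
  def main(args: Array[String]): Unit = {
    val v = BigDecimal("10.004") // value as parsed from the JSON
    // Default DecimalType(38, 18): 18 digits after the decimal point.
    println(v.setScale(18)) // 10.004000000000000000
    // After cast("decimal(38,6)"): only 6 fractional digits remain.
    println(v.setScale(6)) // 10.004000
  }
}
```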
Alternatively, you can define the schema and parse the JSON directly:
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{DecimalType, StructField, StructType}

val schema = StructType(StructField("k", DecimalType(38, 6), nullable = false) :: Nil)
val df1 = Seq("{\"k\":10.004}").toDF("json")
val result = df1.withColumn("value", from_json($"json", schema))
Schema:
root
|-- json: string (nullable = true)
|-- value: struct (nullable = true)
| |-- k: decimal(38,6) (nullable = true)
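Note that reducing a value's scale to 6 rounds away any extra fractional digits. A hedged plain-Scala sketch of that effect (HALF_UP is used here purely for illustration; the answer does not state which rounding mode Spark applies, and the names and values below are made up):

```scala
object DecimalRoundingSketch {
  def main(args: Array[String]): Unit = {
    // Reducing scale requires a rounding mode; HALF_UP is illustrative here.
    val up = BigDecimal("10.0000005").setScale(6, BigDecimal.RoundingMode.HALF_UP)
    println(up) // 10.000001
    val down = BigDecimal("10.0000004").setScale(6, BigDecimal.RoundingMode.HALF_UP)
    println(down) // 10.000000
  }
}
```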
Source: stackoverflow.com