score:0
I don't have much experience with Spark Streaming, but I believe streaming runs as iterative micro-batches, and in Spark batch execution each action has one sink/output. So you can't store the result in different tables with a single execution.
Now,
- if you write it to one table, readers can simply select only the columns they require. I mean: do you really need to store it in different places?
- you can write it twice, filtering out the fields that are not required
- both write actions will execute the computation of the full dataset, then drop the columns that are not needed
- if computing the full dataset is expensive, you can cache it before the filter+write (see the sketch below)
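Here is a minimal batch-API sketch of the cache + double-write idea. The table and column names (`source_events`, `events_payload`, `events_users`, `id`, etc.) are made up for illustration; adapt them to your own schema and sink:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.storage.StorageLevel

val spark = SparkSession.builder().appName("double-write-sketch").getOrCreate()

// Hypothetical source; replace with your actual computation.
val full: DataFrame = spark.read.table("source_events")

// Cache so the two writes below don't each recompute the full dataset.
full.persist(StorageLevel.MEMORY_AND_DISK)

// First write: only the columns the first consumer needs.
full.select("id", "event_time", "payload")
  .write.mode("append").saveAsTable("events_payload")

// Second write: a different column subset for the second consumer.
full.select("id", "event_time", "user_id")
  .write.mode("append").saveAsTable("events_users")

full.unpersist()
```

Without `persist`, each `saveAsTable` action would trigger a full recomputation of `full`; with it, the second write reuses the cached partitions.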
Source: stackoverflow.com