This is not how a KTable works.
A KTable has an internal state store and keeps exactly one record per key. However, a KTable is continuously updated and is subject to the so-called stream-table duality. Each update to the KTable is sent downstream as a changelog record: https://docs.confluent.io/current/streams/concepts.html#duality-of-streams-and-tables. Thus, each input record results in an output record.
Because it's stream processing, there is no "latest value per key".
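The point above can be sketched without the Kafka Streams API at all. This is a minimal in-memory model of the stream-table duality (the object and function names are made up for illustration): the "table" keeps only the latest value per key, yet every input record still produces a downstream changelog record.

```scala
object DualitySketch {
  // Fold a stream of (key, value) updates into a table.
  // Returns the final table (latest value per key) together with the
  // changelog that was "emitted downstream" - one record per input update.
  def applyChangelog(
      input: List[(String, Int)]
  ): (Map[String, Int], List[(String, Int)]) =
    input.foldLeft((Map.empty[String, Int], List.empty[(String, Int)])) {
      case ((table, changelog), (key, value)) =>
        (table.updated(key, value), changelog :+ (key -> value))
    }
}
```

For example, `applyChangelog(List("a" -> 1, "b" -> 2, "a" -> 3))` yields the table `Map("a" -> 3, "b" -> 2)` (one value per key), but the changelog contains all three updates: the table never "swallows" intermediate records, it only compacts its own state.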
> I have a topic that has a stream of data coming to it. What I need is to create a separate topic from this topic that only has the latest set of values given the keys.
At which point in time do you want a KTable to emit an update? There is no answer to this question because the input stream is conceptually infinite.
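What you can do instead is let the broker do the "keep only the latest value" work: write the stream to an output topic configured with log compaction. Compaction eventually retains only the most recent record per key (readers may still see older records before a compaction pass runs). A sketch with a hypothetical topic name, assuming a Confluent-style `kafka-topics` script on the PATH and a broker at `localhost:9092`:

```shell
# Create a compacted topic; "latest-values" is a placeholder name.
# cleanup.policy=compact tells the broker to eventually keep only the
# newest record per key instead of deleting by retention time.
kafka-topics --create \
  --bootstrap-server localhost:9092 \
  --topic latest-values \
  --partitions 1 --replication-factor 1 \
  --config cleanup.policy=compact
```

Note that compaction is an eventual guarantee, not a per-read one: a consumer bootstrapping from such a topic still has to keep the last value it saw for each key.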