score:1
The error message is really not clear; what's going on is a clash between the Java syntax and the Scala syntax for the tumbling window expression.
This is the Java syntax for a tumbling window, and it is not accepted by the Scala API:
// this does not work in Scala:
// val result = table.window(Tumble.over(lit(4).second()).on($"pt").as("w"))
Somehow, when called from Scala, the "4" ends up wrapped one time too many and fails to be converted into the "4 seconds" duration.
This Scala syntax solves it:
import org.apache.flink.table.api._
...
// this works in Scala:
table.window(Tumble.over(4.second()).on($"pt").as("w"))
In both cases it's the same second() function that gets called, but in the second case its argument has the expected type.
Note that you can also use the infix form of this syntax:
val result = table.window(Tumble over 4.second on $"pt" as "w")
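The reason the wildcard import matters is that `org.apache.flink.table.api._` brings in implicit conversions that add duration methods directly to Scala's `Int`, so `4.second` produces an interval expression rather than a plain literal. Here is a minimal, self-contained sketch of that pattern (this is an illustration of implicit extension methods in general, not Flink's actual implementation; `Interval` and `DurationDsl` are hypothetical names):

```scala
// Sketch of how an implicit class can give Int a `second` method,
// analogous to what importing org.apache.flink.table.api._ enables.
object DurationDsl {
  // A stand-in for Flink's interval expression type (hypothetical).
  case class Interval(millis: Long)

  // The implicit class is only in scope after `import DurationDsl._`,
  // which mirrors why the wildcard import is required in the answer above.
  implicit class IntDurationOps(private val n: Int) extends AnyVal {
    def second: Interval = Interval(n * 1000L)
  }
}

object Demo extends App {
  import DurationDsl._
  val w = 4.second       // Int is implicitly wrapped, yielding Interval(4000)
  println(w)
}
```

Without the import, `4.second` does not compile at all; with it, the compiler inserts the wrapper automatically, which is why the Scala DSL form type-checks where the Java-style `lit(4)` form does not.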
Source: stackoverflow.com