1) The recommended way to check whether a topic exists is to use the AdminClient API: you can call listTopics() or describeTopics().

2) Assuming you don't have privileged access to the cluster (to check metrics or liveness probes), the only way to check that the cluster is running is to try connecting to it and using it. With the AdminClient, you can call describeCluster(), for example, to attempt to retrieve the state of the cluster.
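Both checks can be sketched in Scala as follows. This is a minimal sketch, not a production implementation: the bootstrap address (localhost:9092) and the topic name ("my-topic") are placeholders you would replace with your own, and it assumes the org.apache.kafka:kafka-clients dependency is on the classpath.

```scala
import java.util.Properties
import scala.jdk.CollectionConverters._
import org.apache.kafka.clients.admin.{Admin, AdminClientConfig}

object TopicCheck extends App {
  // Assumption: a broker is reachable at localhost:9092; adjust for your cluster.
  val props = new Properties()
  props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  props.put(AdminClientConfig.REQUEST_TIMEOUT_MS_CONFIG, "5000")

  val admin = Admin.create(props)
  try {
    // 1) Topic existence: listTopics() returns the topic names visible to this client.
    val topicNames = admin.listTopics().names().get().asScala
    val exists = topicNames.contains("my-topic") // "my-topic" is a placeholder
    println(s"my-topic exists: $exists")

    // 2) Cluster liveness: describeCluster() fails with an exception
    //    if no broker can be reached within the request timeout.
    val clusterId = admin.describeCluster().clusterId().get()
    println(s"Connected to cluster: $clusterId")
  } finally {
    admin.close()
  }
}
```

Note that describeTopics() on a missing topic completes with an UnknownTopicOrPartitionException, so checking membership in listTopics() is usually the simpler existence test.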
Source: stackoverflow.com