This example (org.apache.spark.examples.streaming.FlumeEventCount) works:

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Milliseconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

// Create the context and set the batch interval
val batchInterval = Milliseconds(2000)
val sparkConf = new SparkConf().setAppName("FlumeEventCount")
val ssc = new StreamingContext(sparkConf, batchInterval)

// Create a Flume stream listening on host:port
// (the address the Flume Avro sink sends events to)
val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2)

// Print the count of events received from this server in each batch
stream.count().map(cnt => "Received " + cnt + " flume events.").print()

ssc.start()
ssc.awaitTermination()

Some hints:

  • use val instead of var
  • use the exact IP address instead of a hostname, or add the hostname to /etc/hosts on the related nodes
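On the Flume side, the matching piece is an Avro sink pointed at the host/port that FlumeUtils.createStream listens on. A minimal agent config sketch, assuming a netcat source for testing; the agent name, channel names, ports, and the IP address are all placeholders — the sink hostname must be the exact IP of the Spark receiver node, per the hint above:

```properties
# Hypothetical Flume agent: netcat source -> memory channel -> Avro sink
agent.sources  = netcat-src
agent.channels = mem-ch
agent.sinks    = avro-sink

agent.sources.netcat-src.type = netcat
agent.sources.netcat-src.bind = 0.0.0.0
agent.sources.netcat-src.port = 44444
agent.sources.netcat-src.channels = mem-ch

agent.channels.mem-ch.type = memory

agent.sinks.avro-sink.type = avro
# Exact IP of the node running the Spark Flume receiver (placeholder)
agent.sinks.avro-sink.hostname = 192.168.1.10
agent.sinks.avro-sink.port = 9999
agent.sinks.avro-sink.channel = mem-ch
```

With this running, anything written to the netcat source shows up as events counted by the streaming job above.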
