Configure Spark to Produce MapR Streams Messages

Using the Kafka 0.9 API, you can configure a Spark application to produce MapR Streams messages.

  1. Add the following dependency:
    groupId = org.apache.spark
    artifactId = spark-streaming-kafka-producer_2.10
    version = <spark_version>-mapr-<mapr_eco_version>
    Note: If you are using Spark 1.5.2-1602, specify the version as 1.5.2-mapr-1602.
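    In an sbt build, the dependency might be declared as follows. This is a sketch assuming the 1.5.2-mapr-1602 version from the note above; because the artifactId already encodes the Scala version (_2.10), it is declared with % rather than %%:
    libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-producer_2.10" % "1.5.2-mapr-1602"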
  2. When you write the Spark program, import and use classes from org.apache.spark.streaming.kafka.producer._
    The import makes the following method available on RDDs and DStreams:
    sendToKafka[S <: Serializer[T]](
      topic: String,
      conf: ProducerConf
    ) 
    For example:
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.dstream.{ConstantInputDStream, DStream}
    import org.apache.spark.streaming.kafka.producer._

    // ssc is an existing StreamingContext; replace the broker list, topic,
    // and message count with values for your environment
    val kafkaBrokers = "host:port,host:port"
    val topic = "/sample-stream:sample-topic"
    val numMessages = 10

    val producerConf = new ProducerConf(bootstrapServers = kafkaBrokers.split(",").toList)
    val items = (0 until numMessages).map(i => Item(i, i))
    val defaultRDD: RDD[Item] = ssc.sparkContext.parallelize(items)
    val dStream: DStream[Item] = new ConstantInputDStream[Item](ssc, defaultRDD)

    dStream.sendToKafka[ItemJsonSerializer](topic, producerConf)
    dStream.count().print()
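    The example assumes definitions for the Item case class and the ItemJsonSerializer it references. A minimal sketch of those definitions, assuming the Kafka 0.9 Serializer interface and a hand-rolled JSON string (the field names and JSON layout here are illustrative, not prescribed by the API), might look like this:
    import java.util
    import org.apache.kafka.common.serialization.Serializer

    case class Item(id: Int, value: Int) {
      override def toString: String = s"""{"id": $id, "value": $value}"""
    }

    class ItemJsonSerializer extends Serializer[Item] {
      override def configure(configs: util.Map[String, _], isKey: Boolean): Unit = {}

      // Serialize each Item as UTF-8 JSON bytes
      override def serialize(topic: String, data: Item): Array[Byte] =
        data.toString.getBytes("UTF-8")

      override def close(): Unit = {}
    }
    Any serializer works with sendToKafka as long as it implements org.apache.kafka.common.serialization.Serializer for the element type of the RDD or DStream.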