Spark – consuming messages from Kafka

This tutorial illustrates how to consume messages from Kafka using the Spark shell. Let's start by installing Kafka:
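
The original installation commands are not shown here; the following is a minimal sketch, assuming Kafka 0.8.2.2 built for Scala 2.10 and a topic named `test` — adjust versions, paths, and names to your setup:

```shell
# Download and unpack Kafka (version is an assumption; pick a current release)
wget https://archive.apache.org/dist/kafka/0.8.2.2/kafka_2.10-0.8.2.2.tgz
tar -xzf kafka_2.10-0.8.2.2.tgz
cd kafka_2.10-0.8.2.2

# Start ZooKeeper first, then the Kafka broker
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &

# Create a topic to consume from (topic name "test" is an assumption)
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic test
```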

At this point Kafka is up and running.

Create a new sbt project, edit the *.sbt file, and add the following lines:
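
A plausible build.sbt fragment, assuming Spark 1.6.x on Scala 2.10 (both assumptions — match the versions to your Spark installation):

```scala
name := "kafka-spark-consumer"   // project name is an assumption

version := "1.0"

scalaVersion := "2.10.5"

// spark-streaming is provided by the Spark installation at runtime;
// spark-streaming-kafka brings in the Kafka 0.8 consumer integration
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3"
)
```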

Then build the project:
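
Presumably with the standard sbt packaging command, which resolves the dependencies and compiles the project:

```shell
sbt package
```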

Now copy the following jars:
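
The exact jar list was not captured; for the Kafka 0.8 integration, the Spark shell typically needs the artifacts below. Versions and ivy-cache paths are assumptions — locate the jars that sbt actually resolved:

```shell
# Copy the Kafka integration jars somewhere the Spark shell can load them
# (versions and cache paths are assumptions)
cp ~/.ivy2/cache/org.apache.spark/spark-streaming-kafka_2.10/jars/spark-streaming-kafka_2.10-1.6.3.jar /usr/local/spark/lib/
cp ~/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.2.jar /usr/local/spark/lib/
cp ~/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar /usr/local/spark/lib/
cp ~/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar /usr/local/spark/lib/
```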

Start the Spark shell:
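
Something like the following, passing the jars copied in the previous step via `--jars` (paths and versions are assumptions):

```shell
cd /usr/local/spark
bin/spark-shell --jars lib/spark-streaming-kafka_2.10-1.6.3.jar,lib/kafka_2.10-0.8.2.2.jar,lib/zkclient-0.3.jar,lib/metrics-core-2.2.0.jar
```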

In another shell, create kafka.scala in /usr/local/spark:
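
A sketch of kafka.scala, assuming the receiver-based `KafkaUtils.createStream` API from spark-streaming-kafka, a local ZooKeeper, and a topic named `test`. It relies on the `sc` SparkContext that the Spark shell provides, so it is meant to be loaded from the shell, not compiled standalone:

```scala
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._

// Reuse the shell's SparkContext (sc); the 2-second batch interval is an assumption
val ssc = new StreamingContext(sc, Seconds(2))

// ZooKeeper address, consumer group, and topic map ("test" with one receiver
// thread) are assumptions — change them to match your broker
val messages = KafkaUtils.createStream(ssc, "localhost:2181", "spark-group", Map("test" -> 1))

// createStream yields (key, value) pairs; keep only the message value
val lines = messages.map(_._2)

// Print each received batch to the console
lines.print()

ssc.start()
```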

Switch back to the Spark shell and type:
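
In the Spark shell, the script can be evaluated with the REPL's `:load` command:

```scala
:load /usr/local/spark/kafka.scala
```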

This is the output (I have configured the Spark log to suppress INFO messages on the console, so your output might be more verbose):

On the Kafka server, go to the Kafka root directory and launch:
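
Presumably the console producer — broker address and topic name are assumptions:

```shell
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
```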

Type some lines and you will see them in the Spark shell 🙂

You can even use the Kafka producer to read a log file:
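
For example, by redirecting a file into the console producer — the log path here is an assumption; any text file works:

```shell
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test < /var/log/syslog
```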

Bye


Posted in Kafka, Scala, Spark, Tutorial.