Spark & Kafka docker-compose

To quickly launch Spark and Kafka for local development, Docker is the top choice thanks to its flexibility and isolated environments. It saves a lot of time otherwise spent manually installing packages and resolving version conflicts.

Prerequisites

It requires the following to be installed on your local machine:

Docker Compose

The docker-compose file includes two main services, Kafka and Spark, briefly described below:
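The compose file itself is not reproduced in this post. As a rough sketch of what such a setup can look like, here is a minimal single-broker configuration using the Bitnami images (the image tags, port mappings, and Kafka KRaft settings are assumptions for illustration, not the author's configuration):

```yaml
version: "3.8"
services:
  spark-master:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"   # Spark master web UI
      - "7077:7077"   # Spark master RPC

  spark-worker:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master

  kafka:
    image: bitnami/kafka:3.7
    ports:
      - "9092:9092"
    environment:
      # Single-node KRaft mode, so no separate ZooKeeper container is needed.
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka:9093
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,CONTROLLER://:9093
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      # Advertise localhost so clients on the host machine can connect.
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092
```

Start the stack with `docker compose up -d` and tear it down with `docker compose down`.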

Here is the list of containers after starting docker-compose (`docker compose ps`):

Access the Spark master web UI at http://localhost:8080
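Once the stack is up, a Spark job on the host can consume the Kafka topic through the exposed ports. A minimal PySpark sketch of such a consumer is shown below; it requires the running compose stack, and the topic name `events`, the `localhost` addresses, and the connector package version are assumptions for illustration, not details from this post:

```python
# Submit with the Kafka connector on the classpath, e.g.:
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0 consume.py
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kafka-smoke-test")
    .getOrCreate()
)

# Read from the broker that docker-compose publishes on localhost:9092.
# "events" is a hypothetical topic name.
df = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "earliest")
    .load()
)

# Kafka keys and values arrive as binary; cast them to strings for inspection.
query = (
    df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .writeStream
    .format("console")
    .start()
)
query.awaitTermination()
```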

Related posts: Deserialize Avro Kafka message in pyspark · Setup Spark local development
