This repository contains docker-compose files for running a Kafka cluster in different configurations.
This is a docker-compose file to run a Kafka cluster with Connect and S3 (MinIO) support. There's a sample connector that reads from a topic and writes to S3.
docker-compose -f cp-docker-compose.yml build
docker-compose -f cp-docker-compose.yml up
The image is based on Confluent's all-in-one images.
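The sample connector's exact configuration lives in the repository; as a hedged sketch, registering an S3 sink that points at MinIO through the Connect REST API typically looks like the snippet below (connector name, topic, bucket, and endpoint are illustrative assumptions, not the repo's actual values):

import requests

# Hypothetical S3 sink config for MinIO; topic, bucket and endpoint are assumptions.
connector = {
    "name": "s3-sink",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "tasks.max": "1",
        "topics": "users",
        "s3.bucket.name": "kafka-bucket",
        "s3.region": "us-east-1",
        "store.url": "http://minio:9000",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "3",
    },
}

# POST the config to the Connect REST API (default port 8083).
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()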
- TODO: Integrate Flink into this stack.
This is a docker-compose file to run a Kafka cluster with Flink support.
docker-compose -f kafka-flink.docker-compose.yml build
docker-compose -f kafka-flink.docker-compose.yml up
Start the producer and type in some text as input (src/read_lines/producer.sh):
docker exec -it broker kafka-console-producer.sh --bootstrap-server localhost:9092 --topic users
In another terminal, start the consumer (a Flink job):
docker exec -it stream-stack-playground_jobmanager_1 flink run -py /opt/src/read_lines/consumer.py
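The consumer job itself lives in src/read_lines/consumer.py; as a rough sketch, a PyFlink job that reads lines from a Kafka topic and prints them typically looks something like this (broker address, topic, and group id are assumptions, and the Kafka connector jar is assumed to be on the Flink classpath):

from pyflink.common.serialization import SimpleStringSchema
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors import FlinkKafkaConsumer

env = StreamExecutionEnvironment.get_execution_environment()

# Kafka source; the broker address and topic name are assumptions for this sketch.
source = FlinkKafkaConsumer(
    topics='users',
    deserialization_schema=SimpleStringSchema(),
    properties={'bootstrap.servers': 'broker:29092', 'group.id': 'read_lines'},
)

# Print every line received from the topic and run the job.
env.add_source(source).print()
env.execute('read_lines')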
For the JSON events example, produce data with:
python src/flink_kafka/json_events/producer.py --num-events 1000
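The producer script emits synthetic JSON events; a hedged sketch of what such a script might do with kafka-python (topic name, event schema, and library choice are assumptions):

import argparse
import json
import random
import time
import uuid

from kafka import KafkaProducer

parser = argparse.ArgumentParser()
parser.add_argument('--num-events', type=int, default=1000)
args = parser.parse_args()

# Serialize each event dict as JSON; broker address and topic name are assumptions.
producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8'),
)

for _ in range(args.num_events):
    producer.send('json_events', {
        'id': str(uuid.uuid4()),
        'value': random.random(),
        'ts': int(time.time() * 1000),
    })

producer.flush()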
Datastream to datastream:
docker exec -it stream-stack-playground_jobmanager_1 flink run -py /opt/src/json_events/consumer_stream.py
Datastream to table:
docker exec -it stream-stack-playground_jobmanager_1 flink run -py /opt/src/json_events/consumer_table.py
Table API consumer:
docker exec -it stream-stack-playground_jobmanager_1 flink run -py /opt/src/tables/consumer.py
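The stream and table variants differ mainly in whether the Kafka-backed DataStream is processed directly or first converted into a Table; a minimal, self-contained sketch of that conversion in PyFlink (using an in-memory collection instead of Kafka, with illustrative column names):

from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

# Stand-in for the Kafka-backed stream used by the real jobs.
ds = env.from_collection(
    [(1, 'alice'), (2, 'bob')],
    type_info=Types.TUPLE([Types.INT(), Types.STRING()]),
)

# Convert the DataStream into a Table and query it with SQL.
table = t_env.from_data_stream(ds).alias('id', 'name')
t_env.create_temporary_view('events', table)
t_env.sql_query('SELECT id, name FROM events').execute().print()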