Uses the Apache Flink Docker image with Kinesalite (and maybe Kafka)
- Clone This Repo
- Install IntelliJ
- Install Docker
- Run the Included Docker Image
- Set Up a Local Kinesis Stream
- Modify the Code Samples
- Run the Code Samples
- Enjoy
This is where you will run the code in this repo. You can use any IDE you'd like, but IntelliJ is particularly well suited to Flink workloads: when you run a Flink job from the IDE, Flink spins up an embedded mini Apache Flink cluster on your behalf, so no separate cluster installation is needed.
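To illustrate, here is a minimal sketch (not part of this repo; the class and job names are made up): when `StreamExecutionEnvironment.getExecutionEnvironment()` is called from a `main()` method launched inside the IDE, Flink backs it with an embedded mini cluster, so the job below runs end to end with no cluster setup.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Minimal sketch (not from this repo): run from the IDE,
// getExecutionEnvironment() is backed by an embedded mini Flink cluster.
public class MiniClusterExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("hello", "flink") // tiny in-memory source
           .map(String::length)            // map each string to its length
           .print();                       // write results to stdout

        env.execute("mini-cluster-example"); // launches the embedded cluster
    }
}
```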
Docker will allow you to run a Kinesalite image locally on your machine!
You also need to install Docker Compose to run the provided docker-compose template.
Once you've installed Docker, open a terminal (either IntelliJ's built-in terminal or your machine's own), navigate to the root of the git project, and run:

```sh
docker-compose up -d
```

This will start your Kinesalite process at the following URL:

```
https://localhost:4567
```

The `-d` flag runs the container in detached mode, so it keeps running in the background.
Execute the following two commands in succession in a terminal to create a local stream called `my-local-stream`, and then publish a record whose payload is given by the `--data` field:

```sh
aws kinesis create-stream --endpoint-url https://localhost:4567 --stream-name my-local-stream --shard-count 6 --no-verify-ssl
aws kinesis put-record --endpoint-url https://localhost:4567 --stream-name my-local-stream --data mytestdata --partition-key 123 --no-verify-ssl
```
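If you want to sanity-check the setup, the stream should now be visible through the same endpoint flags, e.g. `aws kinesis describe-stream --endpoint-url https://localhost:4567 --stream-name my-local-stream --no-verify-ssl`.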
In `/src/main/java/Kinesis_StreamingJob.java`, you can modify the code to point to the data stream you created in the previous step.
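For orientation, here is a hedged sketch of what such a job can look like when wired to the local setup above. It is not the repo's actual `Kinesis_StreamingJob`; the class name, job name, region, and dummy credentials are assumptions. It uses the Flink Kinesis connector (`flink-connector-kinesis`) to read `my-local-stream` from the Kinesalite endpoint and prints the length of each record, matching the behavior described below.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

// Hedged sketch, not the repo's actual job: consumes my-local-stream from
// the local Kinesalite endpoint and prints each record's length.
public class LocalKinesisJobSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties consumerConfig = new Properties();
        // The connector requires a region even locally; this value is an assumption.
        consumerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1");
        // Kinesalite accepts any credentials; these dummies are placeholders.
        consumerConfig.put(AWSConfigConstants.AWS_ACCESS_KEY_ID, "fakeAccessKey");
        consumerConfig.put(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "fakeSecretKey");
        // Point the connector at the local Kinesalite endpoint from docker-compose.
        consumerConfig.put(AWSConfigConstants.AWS_ENDPOINT, "https://localhost:4567");
        // Start reading from the beginning of the stream.
        consumerConfig.put(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "TRIM_HORIZON");

        DataStream<String> records = env.addSource(
                new FlinkKinesisConsumer<>("my-local-stream", new SimpleStringSchema(), consumerConfig));

        records.map(String::length).print(); // print the length of each record

        env.execute("local-kinesis-job-sketch");
    }
}
```

Pointing `AWSConfigConstants.AWS_ENDPOINT` at Kinesalite is what keeps the job entirely local; the rest of the configuration is the same as it would be against real Kinesis.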
Note: Ensure you are using Java 8 or higher, and in your run configuration, enable "Include dependencies with 'Provided' scope".
In IntelliJ, hit Run; you can then keep publishing messages and watch the Flink app print out the length of each string you pass:

```sh
aws kinesis put-record --endpoint-url https://localhost:4567 --stream-name my-local-stream --data myverylongstringdata --partition-key 123 --no-verify-ssl
```
Please note: the `ssl` folder in this repo contains a test credential that is required for running Kinesalite locally, due to how the AWS CLI works.