local-flink-with-docker

Uses the Apache Flink Docker image with Kinesalite (and maybe Kafka).

  1. Clone this repo
  2. Install IntelliJ
  3. Install Docker
  4. Run the included Docker image
  5. Set up a local Kinesis stream
  6. Modify the code samples
  7. Run the code samples
  8. Enjoy

Install IntelliJ

This is where you will run the code in this repo. You can use any IDE you'd like, but IntelliJ is well suited to Flink workloads: running the job from the IDE spins up a mini Apache Flink cluster on your behalf.

Install IntelliJ

Install Docker

Docker will allow you to run a Kinesalite image locally on your machine!

Install Docker

You will also need Docker Compose to run the provided docker-compose file.

Run the Docker image

Once you've installed Docker, open a terminal (either the IntelliJ terminal or your machine's terminal), navigate to the root of this repository, and run:

docker-compose up -d

This starts Kinesalite locally at the following URL: https://localhost:4567

The -d flag runs the containers in detached mode, i.e. in the background.

Set up local Kinesis Stream

Execute the following two commands in a terminal: the first creates a local stream called my-local-stream, and the second publishes a record to it with the payload given in the --data argument.

aws kinesis create-stream --endpoint-url https://localhost:4567 --stream-name my-local-stream --shard-count 6 --no-verify-ssl
aws kinesis put-record --endpoint-url https://localhost:4567 --stream-name my-local-stream --data mytestdata --partition-key 123 --no-verify-ssl

Modify code samples

In /src/main/java/Kinesis_StreamingJob.java, modify the code so that it points to the data stream you created in the previous step.
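
For reference, here is a minimal sketch of what pointing a Flink job at the local stream could look like, assuming the job uses the standard FlinkKinesisConsumer from the flink-connector-kinesis dependency. The class name, region, placeholder credentials, and overall structure below are illustrative and may not match the actual Kinesis_StreamingJob.java.

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

// Illustrative stand-in for Kinesis_StreamingJob; the real class may be structured differently.
public class LocalKinesisJobSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Point the Kinesis connector at the local Kinesalite endpoint instead of real AWS.
        Properties consumerConfig = new Properties();
        consumerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1");                // any valid region string
        consumerConfig.setProperty(AWSConfigConstants.AWS_ENDPOINT, "https://localhost:4567"); // Kinesalite started by docker-compose
        consumerConfig.setProperty(AWSConfigConstants.AWS_ACCESS_KEY_ID, "localTestKey");      // Kinesalite does not validate credentials
        consumerConfig.setProperty(AWSConfigConstants.AWS_SECRET_ACCESS_KEY, "localTestSecret");
        consumerConfig.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "TRIM_HORIZON"); // read the stream from the beginning

        // NOTE: the local endpoint serves the self-signed test certificate from the ssl folder,
        // so the JVM/AWS SDK must trust it (or certificate checking must be disabled) to connect.

        // Consume the stream created with `aws kinesis create-stream` above.
        DataStream<String> records = env.addSource(
                new FlinkKinesisConsumer<>("my-local-stream", new SimpleStringSchema(), consumerConfig));

        // Placeholder sink: print raw records to the IDE console.
        records.print();

        env.execute("local-flink-with-kinesalite");
    }
}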

Run code samples

Note: Ensure you are using Java 8 or higher, and in your run configuration, include dependencies with 'provided' scope.

Hit Run

In IntelliJ, hit Run. You can then keep publishing messages and watch the Flink app print the length of each string you send:

aws kinesis put-record --endpoint-url https://localhost:4567 --stream-name my-local-stream --data myverylongstringdata --partition-key 123 --no-verify-ssl
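
The length-printing behaviour described above corresponds, in the sketch from the "Modify code samples" step, to mapping each record to its length before printing. Again, this is an illustration rather than the repo's actual code; it additionally needs import org.apache.flink.api.common.functions.MapFunction;

// In the sketch above, replace records.print() with a map that emits each record's length.
records.map(new MapFunction<String, Integer>() {
    @Override
    public Integer map(String value) {
        return value.length(); // e.g. "myverylongstringdata" -> 20
    }
}).print();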

Please note: the ssl folder in this repo contains test credentials that are required for running Kinesalite locally, due to how the AWS CLI works.
