
diffusion-playground/kafka-rest-integration-no-code


Video Tutorial

Fine-grained fan-out and replication from a REST API to a Kafka event cluster.

An introduction to Diffusion Real-Time Event Streams through a simple application using Diffusion Cloud, Apache Kafka and the ESPN live REST API.

A simple project illustrating real-time replication and fan-out of ESPN live sports events from the ESPN live REST API to a Kafka event cluster, through a Diffusion Cloud instance, using our Kafka Adapter and REST Adapter.

This tutorial will help you consume live sports events from the ESPN live REST API, filter the portion of the data you are interested in, and transform it on the fly via our powerful Topic Views. Then simply deliver only the portion of the data you care about to a Kafka cluster, and reduce data transfer by using our Data-Deltas compression to deliver the same information with up to 90% data savings.

This tutorial focuses purely on the no-code solution for delivering event data between REST APIs and Kafka clusters, where not all of the firehose data from the source needs to be replicated or fanned out to Kafka.

Fine-grained distribution of Kafka event firehose with Topic Views

This tutorial introduces the concept of Topic Views, a dynamic mechanism for mapping part of a server's Topic Tree to another. This enables real-time data transformation before replicating to a remote cluster, and lets you create dynamic data models from on-the-fly data (eg: Kafka firehose data, or REST API data sources). This lesson also shows how to use our REST Adapter and our Kafka Adapter to ingest and broadcast sports data using Diffusion Topic Views, so you consume only what you need.

Features used in this lesson

Step 1: Configure REST Adapter in Cloud to ingest from ESPN Sports live events

Step X: Configure Kafka Adapter in Cloud to ingest from Kafka Cluster A

Adapters > Kafka Adapter > Ingest from Kafka Config:

	Bootstrap Server > connect to your Kafka cluster A (eg: "kafka-sasl.preprod-demo.pushtechnology")
	Diffusion service credentials > admin, password (use the "Security" tab to create a user or admin account)
	Kafka Topic subscription > the source topic from your Kafka cluster (eg: "FXPairs")
	Kafka Topic value type > we are using JSON, but it can be string, integer, byte, etc.
	Kafka Topic key type > use string type for this code example.
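The settings above assume the FXPairs topic carries JSON values. A sketch of what a single record might look like (the field names pairName and tiers are assumptions here, chosen to match the Topic View selectors used in the later steps; the top-level value key mirrors how the /value/pairs selector addresses the payload):

```json
{
  "value": {
    "pairs": [
      { "pairName": "GBP-USD", "tiers": [ { "bid": 1.2655, "ask": 1.2665 } ] },
      { "pairName": "EUR-USD", "tiers": [ { "bid": 1.0815, "ask": 1.0825 } ] }
    ]
  }
}
```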

Step 2: Check that the Kafka stream FXPairs is ingested

We can see that events from the FXPairs Kafka topic (from cluster A) are now being published to the Diffusion topic path FXPairs. If there are no new events, it might be because the FXPairs topic has not received any updates from Kafka yet.

Step 3: Create Topic Views using Source value directives

Source value directives use the expand() value directive to create multiple reference topics from a single JSON source topic, or the value() directive to create a new JSON value from a subset of a JSON source topic.

We are going to map the FXPairs topic stream (which we get from Kafka cluster A) to a new Diffusion Topic View with path pairs/<expand(/value/pairs,/pairName)>/, where /value/pairs selects the array of currency pairs in the Kafka payload and /pairName names each reference topic after that pair's pairName field (part of the JSON structure in the FXPairs Kafka topic).

This Topic View takes care of dynamic branching and routing of event streams in real time, sending only the specific currency pair from Kafka cluster A to Kafka cluster B, rather than the whole stream.

Topic View Specification for Currency Breakout:
map ?FXPairs// to pairs/<expand(/value/pairs,/pairName)>/
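Not the real Topic Views engine, but the effect of the currency breakout specification can be sketched in plain Python. The payload shape (value/pairs/pairName) is an assumption matching the selectors above:

```python
# Hypothetical FXPairs payload; field names chosen to match the
# /value/pairs and /pairName selectors in the specification above.
event = {
    "value": {
        "pairs": [
            {"pairName": "GBP-USD", "tiers": [{"bid": 1.2655, "ask": 1.2665}]},
            {"pairName": "EUR-USD", "tiers": [{"bid": 1.0815, "ask": 1.0825}]},
        ]
    }
}

def expand_pairs(event):
    """Mimic `map ?FXPairs// to pairs/<expand(/value/pairs,/pairName)>/`:
    one reference topic per array element, named by its pairName."""
    return {f"pairs/{p['pairName']}": p for p in event["value"]["pairs"]}

topics = expand_pairs(event)
# topics keys: 'pairs/GBP-USD', 'pairs/EUR-USD'
```

Each key is a Diffusion topic path and each value is the JSON published to it; the real engine keeps these reference topics updated as new FXPairs events arrive.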

Topic View Specification for Tiers Expansion:
map ?pairs// to tiers/<expand(/tiers)>/<path(1)>
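The tiers expansion can be sketched the same way. Two assumptions here: each pairs/ topic value carries a tiers array, and <expand(/tiers)> with no path name falls back to the array index (which is how a path like tiers/2/GBP-USD arises); <path(1)> reuses the second element of the source path:

```python
def expand_tiers(source_path, value):
    """Mimic `map ?pairs// to tiers/<expand(/tiers)>/<path(1)>`:
    one topic per element of the tiers array, keyed by array index,
    with the pair name taken from the source path."""
    pair_name = source_path.split("/")[1]  # <path(1)>, eg 'GBP-USD'
    return {f"tiers/{i}/{pair_name}": tier
            for i, tier in enumerate(value["tiers"])}

paths = expand_tiers("pairs/GBP-USD",
                     {"tiers": [{"bid": 1.2655}, {"bid": 1.2650}]})
# paths keys: 'tiers/0/GBP-USD', 'tiers/1/GBP-USD'
```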

Step 4: Dynamic branching and routing of Kafka events firehose

As new events come in from the Kafka cluster A firehose, Diffusion dynamically branches and routes the currency pairs on the fly while replicating and fanning out to Kafka cluster B.

Note: The topic path will dynamically change as new currency pair values come in.

The following image shows a Topic View for the following specification:
+pairs > map ?FXPairs// to pairs/<expand(/value/pairs,/pairName)>/
+tiers > map ?pairs// to tiers/<expand(/tiers)>/<path(1)>

When clicking on the "+" for "tiers" topic tree, the following image shows a Topic View for the following specification:
map ?pairs// to tiers/<expand(/tiers)>/<path(1)>

Suggested: 6 Lessons Using Topic Views

Lesson 1: Mapping Topics

Step 5: Configure Kafka Adapter in Cloud to broadcast to Kafka cluster B

Go to: Diffusion Cloud > Manage Service > Adapters > Kafka Adapter > Broadcast to Kafka

Adapters > Kafka Adapter > Broadcast to Kafka Config:

	Bootstrap Server > connect to your Kafka cluster B (eg: "kafka-plain.preprod-demo.pushtechnology")
	Diffusion service credentials > admin, password (use the "Security" tab to create a user or admin account)
	Topic subscription > the source topic from Diffusion to be broadcast to Kafka cluster B (eg: from "tiers/2/GBP-USD" to "FXPairs.tier2.GBP.USD")
	Topic value type > we are using JSON, but it can be string, integer, byte, etc.
	Kafka Topic key type > use string type for this code example.
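The example above renames the Diffusion path "tiers/2/GBP-USD" to the Kafka topic "FXPairs.tier2.GBP.USD" (Kafka topic names cannot contain "/"). A sketch of one such renaming rule — this is only an illustration of the mapping shown, not the adapter's actual implementation:

```python
def diffusion_path_to_kafka_topic(path):
    """Illustrative rule: 'tiers/<n>/<BASE>-<QUOTE>' becomes
    'FXPairs.tier<n>.<BASE>.<QUOTE>', a legal Kafka topic name."""
    _, tier, pair = path.split("/")
    base, quote = pair.split("-")
    return f"FXPairs.tier{tier}.{base}.{quote}"

topic = diffusion_path_to_kafka_topic("tiers/2/GBP-USD")
# topic == 'FXPairs.tier2.GBP.USD'
```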

Pre-requisites

  • Download our code examples or clone them to your local environment:
 git clone https://github.com/diffusion-playground/kafka-integration-no-code/
  • A Diffusion service (Cloud or On-Premise), version 6.6 or greater (update to the latest preview version). Create a service here.
  • Follow our Quick Start Guide and get your service up in a minute!

FX Data Generator

Really easy: just open the index.html file locally and off you go!
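The repository's generator runs in the browser via index.html. For reference, a similar feed could be sketched in Python as a small random walk around indicative mid rates (all rates and field names here are made-up assumptions matching the payload shape used in this tutorial):

```python
import json
import random

# Indicative mid rates to wander around (made-up values).
MIDS = {"GBP-USD": 1.27, "EUR-USD": 1.08, "USD-JPY": 151.0}

def next_tick():
    """Produce one FXPairs-style event with small random moves."""
    pairs = []
    for name in MIDS:
        MIDS[name] *= 1 + random.uniform(-0.001, 0.001)  # drift the mid
        mid = MIDS[name]
        pairs.append({"pairName": name,
                      "tiers": [{"bid": round(mid - 0.0005, 5),
                                 "ask": round(mid + 0.0005, 5)}]})
    return {"value": {"pairs": pairs}}

print(json.dumps(next_tick()))
```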
