
Deploying Hadoop and Spark on a Swarm Cluster

This document describes how to deploy a Hadoop cluster and a Spark cluster on top of a Docker Swarm cluster: a Hadoop node and a Spark node are deployed on each VM.

Architecture diagram

prerequisites:

3 VMs participating in a Swarm cluster: install Docker on each VM, initialize a Swarm cluster on one VM (the manager), then have the other VMs join the cluster, as sketched below.
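
A minimal sketch of the initialization step, assuming the manager VM is reachable at 10.0.0.1 (a placeholder address, substitute your own):

	# on the first VM: initialize the Swarm, making this node a manager
	docker swarm init --advertise-addr 10.0.0.1

	# swarm init prints a join command with a token; run it on each of the
	# other two VMs (the token and address below are placeholders)
	docker swarm join --token <worker-token> 10.0.0.1:2377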

network:

	docker network create -d overlay --attachable network
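
Note that the overlay network is literally named network, and --attachable also lets standalone containers connect to it. To verify it was created, run on the manager:

	docker network ls --filter driver=overlay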

hadoop:

	docker stack deploy -c docker-compose-hadoop.yml hadoop
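
The actual stack file is in the repository; for orientation, a compose file deployed this way typically declares its services on the overlay network created above, referenced as external. A minimal sketch, assuming a generic Hadoop image and service name (not the repository's actual file):

	version: "3.8"
	services:
	  namenode:                  # hypothetical service name
	    image: apache/hadoop:3   # assumed image, not necessarily the one used here
	    ports:
	      - "9870:9870"          # default Hadoop 3 NameNode web UI port
	    networks:
	      - network
	networks:
	  network:
	    external: true           # reuse the attachable overlay network created above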

spark:

	docker stack deploy -c docker-compose-spark.yml spark
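
After each deploy, you can check that the services have converged (run on a manager node; the stack names match the deploy commands above):

	docker stack services hadoop
	docker stack services spark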

services:

	docker stack deploy -c docker-compose-services.yml services
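
To tear everything down, remove the stacks and then the network:

	docker stack rm services spark hadoop
	docker network rm network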
