This document describes how to deploy a Hadoop cluster and a Spark cluster on top of a Docker Swarm cluster: on each VM, one Hadoop node and one Spark node are deployed.
Prerequisite: three VMs participating in a Swarm cluster. Install Docker on each VM, initialize a Swarm on one VM (the manager), then have the other VMs join the cluster.
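For reference, a minimal sketch of the Swarm bootstrap; <MANAGER-IP> and <TOKEN> are placeholders, the actual join token is printed by the init command:

# On the manager VM
docker swarm init --advertise-addr <MANAGER-IP>

# On each of the two other VMs, using the token printed above
docker swarm join --token <TOKEN> <MANAGER-IP>:2377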
Create an attachable overlay network, named network, that both stacks will share:

docker network create -d overlay --attachable network
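The --attachable flag lets standalone containers, not only Swarm services, connect to the overlay, which is handy for ad-hoc debugging. For example, assuming a service named hadoop-namenode is later deployed on this network (a hypothetical name; use whatever your compose files actually define):

# Run a throwaway container on the overlay and check DNS/connectivity
docker run --rm -it --network network alpine ping -c 3 hadoop-namenode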
Then deploy the Hadoop stack, the Spark stack, and the auxiliary services stack from their respective compose files:

docker stack deploy -c docker-compose-hadoop.yml hadoop
docker stack deploy -c docker-compose-spark.yml spark
docker stack deploy -c docker-compose-services.yml services
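To verify the deployment, the standard Swarm inspection commands can be used (stack names as deployed above):

# List deployed stacks and their services
docker stack ls
docker service ls

# Show per-task placement and state for one stack, e.g. hadoop
docker stack ps hadoop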