# Deploying Hadoop and Spark on a Swarm Cluster

This document describes how to deploy a Hadoop cluster and a Spark cluster on top of a Docker Swarm cluster: on each VM, one Hadoop node and one Spark node are deployed.

Architecture diagram

prerequisites:

3 VMs participating in a Swarm cluster. Install Docker on each VM, initialize a Swarm cluster on one VM, then let the other VMs join the cluster.
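The Swarm setup above can be sketched as follows; the manager IP address is a placeholder and must be replaced with the address of your first VM:

```shell
# On the first VM (the future Swarm manager); 192.168.1.10 is a placeholder IP
docker swarm init --advertise-addr 192.168.1.10

# "swarm init" prints a join command containing a token; run it on the other two VMs
docker swarm join --token <worker-token> 192.168.1.10:2377

# Back on the manager, verify that all three nodes are part of the cluster
docker node ls
```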

network:

	docker network create -d overlay --attachable network
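The `--attachable` flag lets standalone containers, not only Swarm services, connect to the overlay network. A quick sanity check after creation:

```shell
# Confirm the overlay network exists and is attachable
docker network inspect network --format '{{.Driver}} {{.Attachable}}'
```

For a correctly created network this should print `overlay true`.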

hadoop:

	docker stack deploy -c docker-compose-hadoop.yml hadoop

spark:

	docker stack deploy -c docker-compose-spark.yml spark

services:

	docker stack deploy -c docker-compose-services.yml services
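After the three stacks are deployed, their state can be checked from the manager; the service name in the last command depends on the contents of `docker-compose-hadoop.yml` and `hadoop_namenode` is only an assumed example:

```shell
# List the deployed stacks; should show hadoop, spark and services
docker stack ls

# Check that every replica of every service has converged (REPLICAS column)
docker service ls

# Follow the logs of one service, e.g. the Hadoop namenode
# ("hadoop_namenode" is an assumption about the compose file's service names)
docker service logs -f hadoop_namenode
```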