Installing a Spark Cluster with Docker (clean, without Hadoop)
https://github.com/mvillarrealb/docker-spark-cluster
1: Prerequisites
- Docker installed
- Docker Compose installed
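Both can be verified from a shell before proceeding:
docker --version
docker-compose --version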
2: Build the images
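One way to fetch the repository into /opt (assuming git is available; downloading a release archive from the GitHub page works just as well):
cd /opt
git clone https://github.com/mvillarrealb/docker-spark-cluster.git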
With the code under /opt, change into the docker-spark-cluster directory and run:
chmod +x build-images.sh
./build-images.sh
This builds the following images:
- spark-base:2.3.1: a base image based on java:alpine-jdk-8 which ships Scala, Python 3, and Spark 2.3.1
- spark-master:2.3.1: an image based on the previously created base image, used to create Spark master containers
- spark-worker:2.3.1: an image based on the previously created base image, used to create Spark worker containers
- spark-submit:2.3.1: an image based on the previously created base image, used to create spark-submit containers (run, deliver the driver, and exit gracefully)
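After the script finishes, the four images should be present locally. A quick sanity check:
docker images | grep spark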
3: Run docker-compose
Run docker-compose up
This will create the cluster.
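Once it is up, the running containers can be listed to confirm that the master and the three workers started:
docker-compose ps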
4: Validate the cluster
- Spark Master
- Spark Worker 1
- Spark Worker 2
- Spark Worker 3
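Each of these nodes exposes a Spark web UI that can be opened in a browser to confirm the cluster is healthy. As a sketch (the container name spark-master and the default master UI port 8080 are assumptions; check the repository's docker-compose.yml for the actual names and addresses):
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' spark-master
curl http://<master-ip>:8080
The master UI should list the three workers in the ALIVE state once they have registered.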