
Installing a Spark Cluster with Docker (clean, without Hadoop etc.)

  https://github.com/mvillarrealb/docker-spark-cluster

1: Prerequisites

  • Docker installed

  • Docker Compose installed
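
To confirm both tools are available before continuing, a quick version check (output will vary by install):

docker --version
docker-compose --version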

2: Build the images

 Download docker-spark-cluster, place it under the /opt directory, and change into the docker-spark-cluster directory.
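
One way to fetch the repository (a sketch assuming git is installed; downloading a release archive works just as well):

git clone https://github.com/mvillarrealb/docker-spark-cluster.git /opt/docker-spark-cluster
cd /opt/docker-spark-cluster

Then make the build script executable and run it: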

chmod +x build-images.sh
./build-images.sh

 Running the script will create the following images (a quick verification snippet follows the list):

  • spark-base:2.3.1: A base image based on java:alpine-jdk-8 which ships Scala, Python 3, and Spark 2.3.1

  • spark-master:2.3.1: An image based on the previously created Spark image, used to create Spark master containers.

  • spark-worker:2.3.1: An image based on the previously created Spark image, used to create Spark worker containers.

  • spark-submit:2.3.1: An image based on the previously created Spark image, used to create Spark submit containers (run, deliver the driver, and die gracefully).
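
Once the script finishes, you can confirm the images exist locally:

docker images | grep spark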

3: Run docker-compose

   Run docker-compose up and the cluster will be created.
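
To keep the terminal free, you can instead start the stack detached and check the containers (standard docker-compose usage):

docker-compose up -d
docker-compose ps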

4: Validate the cluster

Spark Master: http://10.5.0.2:8080/

Spark Worker 1: http://10.5.0.3:8081/

Spark Worker 2: http://10.5.0.4:8081/

Spark Worker 3: http://10.5.0.5:8081/
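
For a command-line check, you can also probe each UI from the host (this assumes the host can reach the 10.5.0.0/16 container network; adjust the addresses if your setup differs):

curl -s http://10.5.0.2:8080/ | grep -io "spark master"
curl -s http://10.5.0.3:8081/ | grep -io "worker"

Each command should print a match if the corresponding UI is serving pages.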