Spark 2.4 Standalone Deployment
Published: 2018-12-27
1 Installing Spark
Download Spark with the following command:
wget http://mirrors.hust.edu.cn/apache/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz
Extract the tgz package:
tar zxvf spark-2.4.0-bin-hadoop2.7.tgz
Add the following lines to ~/.bashrc, then reload it so the variables take effect:
vim ~/.bashrc
export SPARK_HOME=$HOME/spark-2.4.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
source ~/.bashrc
Open the configuration file:
cd $SPARK_HOME/conf
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
Add the following variables:
export JAVA_HOME=/usr/java/jdk1.8.0_191-amd64
export SCALA_HOME=/usr/java/scala-2.11.8
export SPARK_HOME=/root/spark-2.4.0-bin-hadoop2.7
export SPARK_MASTER_IP=** your ip **
export SPARK_EXECUTOR_MEMORY=1G
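One note on these settings: in Spark 2.x, SPARK_MASTER_IP is deprecated in favor of SPARK_MASTER_HOST (the start scripts warn when the old name is used). A sketch of the equivalent setting follows; the port line is optional and just makes the default explicit:

```shell
# Preferred spelling in Spark 2.x (SPARK_MASTER_IP still works but is deprecated):
export SPARK_MASTER_HOST=** your ip **   # bind address of the standalone master
export SPARK_MASTER_PORT=7077            # default standalone master port
```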
2 Deploying Standalone Spark
Start Spark:
cd $SPARK_HOME
./sbin/start-all.sh
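Before opening a shell, you can confirm the daemons actually came up. The check below assumes a JDK is installed so that jps is available; the standalone master's web UI listens on port 8080 by default:

```shell
# List JVM processes; a healthy standalone setup shows a Master and a Worker:
jps
# The master web UI is served on port 8080 by default, e.g.:
#   http://** your ip **:8080
```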
Check whether startup succeeded (using the Spark shell client):
./bin/spark-shell
Output like the following indicates success:
2018-12-26 11:20:33 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://spark-2:4040
Spark context available as 'sc' (master = local[*], app id = local-1545841253565).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
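One caveat about the banner above: master = local[*] means this shell is running in local mode, not against the standalone cluster that was just started. To attach the shell to the cluster, pass the master URL explicitly (7077 is the standalone master's default port; substitute your own host):

```shell
# Connect spark-shell to the standalone master instead of local mode:
./bin/spark-shell --master spark://** your ip **:7077
```

When connected, the banner reports the spark:// master URL instead of local[*], and the shell appears under "Running Applications" in the master's web UI.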