
Spark distributed environment --- slave node fails to start (solved)


soyo@soyo-VPCCB3S1C:~$ start-slaves.sh
soyo-slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
soyo-slave01: failed to launch: nice -n 0 /usr/local2/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://soyo-VPCCB3S1C:7077
soyo-slave01: /usr/local2/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/bin/java: No such file or directory
soyo-slave01: full log in /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
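The error shows the worker's spark-class script invoking a Java binary at a path that does not exist on soyo-slave01. A quick way to confirm this from the master node (a sketch; the JDK path is taken from the error message above):

# Does the path the worker tried to use actually exist on the slave?
ssh soyo-slave01 'ls -l /usr/lib/jvm/java-8-openjdk-amd64/bin/java'
# Where is Java actually installed on the slave?
ssh soyo-slave01 'which java && java -version'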
Solution:
Edit ~/.bashrc on the soyo-slave01 node and correct the JDK installation path (the JDK on that Ubuntu 14.04 machine was not installed as the default OpenJDK package, so /usr/lib/jvm/java-8-openjdk-amd64 does not exist there). After fixing the path, the worker starts normally.
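For reference, a minimal sketch of the fix, assuming the JDK on soyo-slave01 is installed under /usr/local/jdk1.8.0_131 (a placeholder path; substitute wherever the JDK actually lives on that node). In ~/.bashrc on soyo-slave01:

# Point JAVA_HOME at the actual JDK install (placeholder path -- adjust to yours)
export JAVA_HOME=/usr/local/jdk1.8.0_131
export PATH=$JAVA_HOME/bin:$PATH

After running source ~/.bashrc, verify with java -version, then rerun start-slaves.sh on the master. Alternatively, setting JAVA_HOME in conf/spark-env.sh on each node lets Spark locate the JDK without depending on the login shell's environment.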
