
Java Spark remote submission problem

When submitting the application from IntelliJ IDEA, the Spark web UI shows the cluster endlessly adding and removing executors, and the job never runs. Code and screenshots below:

Code:

SparkConf conf = new SparkConf()
                .setSparkHome(sparkHome)
                .setAppName(appName)
                .setMaster(master);
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");

Or:

        SparkSession spark = SparkSession.builder().master("spark://server01:7077").appName("HBASEDATA")
                .getOrCreate();

Console output:

20/07/21 10:29:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-A56927L:4040
20/07/21 10:29:06 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://server01:7077...
20/07/21 10:29:06 INFO TransportClientFactory: Successfully created connection to bikini-bottom/192.168.0.91:7077 after 149 ms (0 ms spent in bootstraps)
...
...
20/07/21 10:29:34 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asked to remove non-existent executor 13
20/07/21 10:29:34 INFO StandaloneSchedulerBackend: Granted executor ID app-20190721102906-0002/14 on hostPort 192.168.0.91:46381 with 1 core(s), 800.0 MB RAM
20/07/21 10:29:34 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190721102906-0002/14 is now RUNNING
20/07/21 10:29:34 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20190721102906-0002/12 is now EXITED (Command exited with code 1)
20/07/21 10:29:34 INFO StandaloneSchedulerBackend: Executor app-20190721102906-0002/12 removed: Command exited with code 1
20/07/21 10:29:34 INFO BlockManagerMaster: Removal of executor 12 requested

Executor allocation on the Spark cluster (screenshot):

Executor error message (screenshot):

---------------------------------------------------------------------------------------------------------------------

Solution: the executors keep exiting with code 1 because they cannot connect back to the driver running inside the IDE, so the driver's host and port must be fixed and reachable from the cluster.

Configure the hosts (make the driver machine's hostname resolvable from the cluster):
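The original post shows only a screenshot here. A typical entry maps the driver machine's hostname to its LAN IP on the master and every worker node (the IP below is a placeholder for illustration, not a value from the post):

```
# /etc/hosts on the Spark master and on each worker node
# <driver-LAN-IP>   <driver-hostname>     (placeholder IP below)
192.168.0.100   DESKTOP-T5HC2II
```

Without this, executors resolve the driver's hostname to nothing (or to the wrong address), fail to register, and exit, which matches the add/remove loop in the log above.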

Modify the code:

SparkConf conf = new SparkConf()
                .setSparkHome(sparkHome)
                .setAppName(appName)
                // specify the driver's hostname
                .set("spark.driver.host","DESKTOP-T5HC2II")
                // specify the driver's service port
                .set("spark.driver.port","9095")
                // executor memory size
                .set("spark.executor.memory","800m")
                // number of driver CPU cores
                .set("spark.driver.cores","1")
                .setMaster(master);
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
Or:
        SparkSession spark = SparkSession.builder().master("spark://server01:7077").appName("HBASEDATA")
                // specify the driver's hostname
                .config("spark.driver.host","DESKTOP-T5HC2II")
                // specify the driver's service port
                .config("spark.driver.port","9092")
                .getOrCreate();
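A common pitfall is setting `spark.driver.host` to a name the workers cannot resolve. As a quick sanity check before submitting (plain JDK, nothing Spark-specific; the class name is made up for illustration), you can print what the driver machine reports as its own hostname and address, then confirm a cluster node can resolve that name:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DriverHostCheck {
    public static void main(String[] args) throws UnknownHostException {
        // The hostname/address Java sees for this machine; whatever value
        // goes into spark.driver.host must resolve to a reachable address
        // from every worker node.
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("hostname: " + local.getHostName());
        System.out.println("address:  " + local.getHostAddress());
    }
}
```

If the printed address is a loopback or VPN address, the executors will try to connect there and fail, producing exactly the "EXITED (Command exited with code 1)" loop shown above.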