Solutions for Spark 2.3.1 errors when reading HBase
Published: 2019-01-07
1. Error: java.lang.IllegalStateException: unread block data
Solution: set spark.driver.extraClassPath and spark.executor.extraClassPath to /usr/cwgis/app/spark/jars/lib/*, the directory of dependency jars generated for the application. This exception typically means the driver and the executors are deserializing tasks against mismatched classpaths, so both sides need to point at the same dependency jars:
sparkConf.set("spark.driver.extraClassPath","/usr/cwgis/app/spark/jars/lib/*"); sparkConf.set("spark.executor.extraClassPath","/usr/cwgis/app/spark/jars/lib/*");
Other SparkConf settings used by the application:
String master = "spark://mycluster:7077";
SparkConf sparkConf = new SparkConf();
sparkConf.setMaster(master);
//sparkConf.setMaster("local"); //yarn-cluster
sparkConf.setJars(JavaSparkContext.jarOfClass(this.getClass()));
sparkConf.setAppName("geowaveSpark");
sparkConf.set("spark.dynamicAllocation.enabled", "false");
sparkConf.set("spark.driver.extraClassPath", "/usr/cwgis/app/spark/jars/lib/*");
sparkConf.set("spark.executor.extraClassPath", "/usr/cwgis/app/spark/jars/lib/*");
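The post does not show the HBase read itself (the application appears to go through GeoWave, judging by the app name). For context, here is a minimal sketch of a direct read using HBase's standard TableInputFormat; the ZooKeeper quorum and the table name "mytable" below are placeholders, not values from the original post:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

JavaSparkContext jsc = new JavaSparkContext(sparkConf);

// HBase connection settings; quorum and table name are placeholders.
Configuration hbaseConf = HBaseConfiguration.create();
hbaseConf.set("hbase.zookeeper.quorum", "mycluster");
hbaseConf.set(TableInputFormat.INPUT_TABLE, "mytable");

// Scan the table as an RDD of (row key, row) pairs.
JavaPairRDD<ImmutableBytesWritable, Result> rows =
        jsc.newAPIHadoopRDD(hbaseConf, TableInputFormat.class,
                ImmutableBytesWritable.class, Result.class);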
2. Error: SparkException: Could not find CoarseGrainedScheduler or it has been stopped
Solution: disable dynamic allocation, so executors are not torn down while the job is still running:
sparkConf.set("spark.dynamicAllocation.enabled", "false");
3. Error: java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream
Solution: exclude the conflicting lz4 artifact from your dependencies; it is usually pulled in by a Kafka jar, and its older net.jpountz.lz4 classes clash with the lz4 version Spark 2.3 ships. Then regenerate the dependency jars and redistribute them to /usr/cwgis/app/spark/jars/lib across the cluster. For example, for the geowave-adapter-vector dependency:
<dependency>
  <groupId>mil.nga.giat</groupId>
  <artifactId>geowave-adapter-vector</artifactId>
  <version>0.9.7</version>
  <exclusions>
    <exclusion>
      <groupId>net.jpountz.lz4</groupId>
      <artifactId>lz4</artifactId>
    </exclusion>
  </exclusions>
</dependency>
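If it is unclear which dependency drags in the old lz4 jar, the Maven dependency tree can be filtered for it (a standard maven-dependency-plugin invocation; not from the original post):

mvn dependency:tree -Dincludes=net.jpountz.lz4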
—the—end—