Spark 1.6 (Java) Troubleshooting Notes
By 阿新 · Published 2019-02-19
1. java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver when reading Oracle through Spark SQL
Background:
The job is submitted to the cluster from a local Java program, but the Oracle JDBC jar is not present on the cluster nodes.
Solution:
The official documentation's troubleshooting note on running JDBC against a cluster says:
Troubleshooting
- The JDBC driver class must be visible to the primordial class loader on the client session and on all executors. This is because Java's DriverManager class does a security check that results in it ignoring all drivers not visible to the primordial class loader when one goes to open a connection. One convenient way to do this is to modify compute_classpath.sh on all worker nodes to include your driver JARs.
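Besides editing compute_classpath.sh, the same classpath settings can be supplied at submit time when the job is launched with spark-submit instead of from embedded Java code. A sketch, assuming illustrative jar paths and an illustrative application class name:

```shell
# Assumptions: the ojdbc jar path, application class, and app jar name
# below are placeholders, not values from the original post.
spark-submit \
  --master spark://master2:7077 \
  --driver-class-path /data/ojdbc14-10.2.0.3.0.jar \
  --conf spark.executor.extraClassPath=/data/ojdbc14-10.2.0.3.0.jar \
  --class com.example.Test \
  my-app.jar
```

`--driver-class-path` puts the jar on the driver's classpath, and `spark.executor.extraClassPath` does the same for every executor, matching the two properties set in code below.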
The fix that ultimately worked was to set the jar on the driver and executor classpaths in code (the jar had already been copied to the same path on every Worker node):
import java.util.HashMap;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

SparkConf conf = new SparkConf().setAppName("Test").setMaster("spark://master2:7077");
// Make the Oracle JDBC jar visible to both the driver and every executor.
conf.set("spark.driver.extraClassPath", "/data/ojdbc14-10.2.0.3.0.jar");
conf.set("spark.executor.extraClassPath", "/data/ojdbc14-10.2.0.3.0.jar");

JavaSparkContext sparkContext = new JavaSparkContext(conf);
// HDFS HA configuration for the "mycluster" nameservice.
sparkContext.hadoopConfiguration().set("fs.defaultFS", "hdfs://mycluster");
sparkContext.hadoopConfiguration().set("dfs.nameservices", "mycluster");
sparkContext.hadoopConfiguration().set("dfs.ha.namenodes.mycluster", "nn1,nn2");
sparkContext.hadoopConfiguration().set("dfs.namenode.rpc-address.mycluster.nn1", "xx.xx.xx.xxx:8020");
sparkContext.hadoopConfiguration().set("dfs.namenode.rpc-address.mycluster.nn2", "xx.xx.xx.xxx:8020");
sparkContext.hadoopConfiguration().set("dfs.client.failover.proxy.provider.mycluster",
        "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");

SQLContext sqlContext = new SQLContext(sparkContext);

// JDBC source options: Oracle thin URL with a TNS-style descriptor.
Map<String, String> options = new HashMap<String, String>();
options.put("url", "jdbc:oracle:thin:mioclub/mioclub3728@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=myrac-scan.mic.com.tw)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=myrac.mic.com.tw)))");
options.put("dbtable", "mio_login_transfer");
options.put("driver", "oracle.jdbc.driver.OracleDriver");

DataFrame jdbcDF = sqlContext.read().format("jdbc").options(options).load();
Row a = jdbcDF.first();
DataFrame d2 = jdbcDF.limit(10);
d2.rdd().saveAsTextFile("/Test/sql.txt");
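The descriptor-style "url" option above can be hard to get right by hand. As a small illustration (not from the original post; the class and method names here are made up for this sketch), the same URL shape can be assembled from its parts:

```java
// Illustrative helper: builds an Oracle thin JDBC URL using a TNS-style
// descriptor like the one passed in the "url" option above. In practice
// credentials are better supplied via the "user"/"password" JDBC options
// than embedded in the URL.
public class OracleUrlBuilder {
    public static String thinUrl(String host, int port, String serviceName) {
        return "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS="
                + "(PROTOCOL=TCP)(HOST=" + host + ")(PORT=" + port + ")))"
                + "(CONNECT_DATA=(SERVICE_NAME=" + serviceName + ")))";
    }

    public static void main(String[] args) {
        System.out.println(thinUrl("myrac-scan.mic.com.tw", 1521, "myrac.mic.com.tw"));
    }
}
```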