Spark program fails with a Guava dependency conflict
While practicing reading a local file with Spark, I ran into the following error:
Exception in thread "main" java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.mapred.FileInputFormat
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:312)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:200)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:46)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
at com.baidu.devprofile.mysql.service.SparkSql.readFile(SparkSql.java:39)
at com.baidu.devprofile.Application.main(Application.java:13)
18/12/17 20:00:00 INFO SparkContext: Invoking stop() from shutdown hook
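For context, here is a minimal sketch of the kind of code that hits this path. This is my own reconstruction (the class name and file path are made up), but textFile plus collect matches the SparkSql.readFile, HadoopRDD, and JavaRDDLike.collect frames in the trace above:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ReadLocalFile {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("read-local-file")
                .setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // textFile is backed by org.apache.hadoop.mapred.FileInputFormat;
            // computing the input splits is where the Stopwatch
            // IllegalAccessError above is thrown.
            JavaRDD<String> lines = sc.textFile("data/input.txt");
            lines.collect().forEach(System.out::println);
        }
    }
}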
Cause:
Errors like this are almost always a Guava version conflict: either the POM declares a Guava dependency directly, or some other dependency pulls one in transitively, and the version that wins is incompatible. So check the POM first.
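Before editing anything, it helps to see every copy of Guava that Maven resolves and which dependency drags it in. The standard Maven command for that:

mvn dependency:tree -Dincludes=com.google.guava:guava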
You may well find nothing wrong in the POM itself. In that case, inspect the resolved dependencies directly, for example in IDEA's External Libraries view, and check which Guava jar is actually on the classpath. In my project it turned out to be version 19.
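If you would rather confirm at runtime which jar a Guava class is actually loaded from, a small diagnostic like this works (my own sketch, not part of the original project):

import com.google.common.base.Stopwatch;

public class WhichGuava {
    public static void main(String[] args) {
        // Prints the location of the jar that Stopwatch was loaded from,
        // i.e. the Guava copy that wins on the runtime classpath.
        System.out.println(
                Stopwatch.class.getProtectionDomain().getCodeSource().getLocation());
    }
}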
That version apparently isn't compatible with the Hadoop classes Spark calls into; a lower Guava release is needed. I added version 15.0 to the POM, ran the job again, and the error was gone:
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>15.0</version>
</dependency>
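An alternative some may find cleaner: instead of adding a direct dependency, pin the version in dependencyManagement so that every transitive request for Guava also resolves to 15.0 (a sketch; adapt to your own POM):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>15.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>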
I searched for a long time without pinning down the exact cause; all I knew was that the version didn't fit. The likely explanation: Guava deprecated the public Stopwatch constructor in 15.0 and made it package-private in 17.0, while Hadoop 2.x's FileInputFormat was compiled against the old API and still calls new Stopwatch() directly, so any Guava 17 or newer on the runtime classpath triggers the IllegalAccessError above.
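A minimal illustration of the API change (my own example, assuming Guava 15 or newer on the classpath):

import java.util.concurrent.TimeUnit;
import com.google.common.base.Stopwatch;

public class StopwatchDemo {
    public static void main(String[] args) throws InterruptedException {
        // 'new Stopwatch()' still links against Guava 15-16 (deprecated but
        // public) yet fails with IllegalAccessError on Guava 17+, where the
        // constructor is package-private. The public factory method below
        // has existed since Guava 15, so it works on both sides.
        Stopwatch sw = Stopwatch.createStarted();
        Thread.sleep(100);
        System.out.println(sw.elapsed(TimeUnit.MILLISECONDS) + " ms elapsed");
    }
}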
Why doesn't Spark pull in a working Guava by default? I'm puzzled; any insight from the experts would be appreciated.