Analyzing and Fixing Spark Logging Errors: log4j and SLF4J
By 阿新 · Published 2018-12-03
Spark logging error messages
Exception details (a problem that took a long time to resolve):
1. log4j class loader conflict: org.apache.log4j.Appender and org.apache.log4j.ConsoleAppender are loaded by different class loaders, so a ConsoleAppender object cannot be assigned to an org.apache.log4j.Appender variable, and SparkContext initialization fails:
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.hadoop.log.metrics.EventCounter" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.hadoop.log.metrics.EventCounter" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "EventCounter".
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [su[email protected]] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [[email protected]].
log4j:ERROR Could not instantiate appender named "NullAppender".
18/06/26 18:29:08 ERROR spark.SparkContext: Error initializing SparkContext.
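Before changing anything, it helps to confirm that two copies of the log4j classes really are on the classpath. A minimal diagnostic sketch, assuming the spark-assembly path from the logs above; target/my-app.jar is a hypothetical name for the application's fat jar:

# List which jars bundle log4j's Appender class. If it appears both in the
# cluster's spark-assembly and in the application jar, each class loader loads
# its own copy, which is exactly what the "not assignable" errors above report.
for j in /usr/local/spark-1.6.1/lib/spark-assembly-1.6.1-hadoop2.6.0.jar target/my-app.jar; do
  echo "== $j"
  jar tf "$j" | grep 'org/apache/log4j/Appender.class' || echo "   (not present)"
done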
2. The SLF4J binding type does not match the class that is actually loaded:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;" the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, org/slf4j/LoggerFactory, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type org/slf4j/ILoggerFactory used in the signature
Cause analysis:
1. Multiple log4j jars are referenced, so the jar versions in the application do not match (conflict with) the jar versions in the cluster environment, which raises the exception.
2. The SLF4J jar the application loads at initialization does not match the SLF4J jar the cluster environment is bound to, which produces the error messages.
3. Root cause of the logging errors:
a. The application itself bundles the Spark-related dependencies.
b. The job is submitted with the configuration below, so the application's own log4j is initialized first and then collides with the cluster's log4j dependencies and configuration; log4j is initialized repeatedly and fails (see the spark-submit sketch after this list):
--conf spark.executor.userClassPathFirst=true \
--conf spark.driver.userClassPathFirst=true \
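For context, here is a minimal spark-submit sketch showing where these flags sit; the main class and jar name are hypothetical placeholders. With a fat jar that bundles its own log4j/SLF4J, this combination reproduces the errors above:

# Hypothetical job: com.example.MyApp and target/my-app.jar are placeholders.
# userClassPathFirst makes Spark's ChildFirstURLClassLoader prefer the classes
# in the fat jar over the cluster's, so two copies of log4j/SLF4J get loaded.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  target/my-app.jar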
Solution:
1. Explicitly declare the three jars org.slf4j:slf4j-api, org.slf4j:slf4j-log4j12, and log4j:log4j, and exclude slf4j-api and log4j from slf4j-log4j12's transitive dependencies.
2. At the same time, keep the slf4j- and log4j-related jars out of the fat jar, and drop the Spark jars from it as well (the provided scope below does both; a verification sketch follows the snippet). For example:
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.16</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.16</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
  <scope>provided</scope>
</dependency>
...
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
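After rebuilding with the provided scopes above, a quick sanity check, again assuming the hypothetical jar name target/my-app.jar, that the fat jar no longer ships its own SLF4J or log4j classes:

# The grep should match nothing: the slf4j and log4j classes now come only
# from the cluster environment, so no second copy rides along in the fat jar.
jar tf target/my-app.jar | grep -E 'org/slf4j/|org/apache/log4j/' \
  && echo "conflicting logging classes are still bundled" \
  || echo "fat jar is clean"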