Sqoop: errors when exporting Hive data to MySQL via Azkaban

ERROR tool.ExportTool: Error during export:

Export job failed!
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
    at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.

A Sqoop import or export is itself a MapReduce job; in this case the map phase finishes, and the failure happens while the results are being exported to MySQL.
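Because the stack trace above only says "Export job failed!" without the underlying cause, it usually pays to pull the logs of the failed map tasks from YARN. A minimal sketch, assuming log aggregation is enabled and using a placeholder application ID:

# Find the failed Sqoop export application
yarn application -list -appStates FAILED

# Fetch its aggregated container logs (the application ID below is a placeholder)
yarn logs -applicationId application_1234567890123_0001 | less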

The rough cause: the Hive table structure and the MySQL table structure do not match, mainly in column count, column types, NOT NULL constraints, and so on. For example, if the Hive data contains NULLs while the corresponding MySQL column has a NOT NULL constraint, the export will obviously fail; a sketch of an export command that accounts for Hive's NULL encoding follows.
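As a minimal sketch (the connection URL, credentials, table name, and HDFS path are made-up placeholders): Hive stores NULL as the literal \N and uses \001 (Ctrl-A) as its default field delimiter, so the export command has to be told about both, roughly like this:

# Hypothetical example -- replace host, database, table, user, password and path with your own
sqoop export \
  --connect jdbc:mysql://mysql-host:3306/test_db \
  --username root \
  --password '******' \
  --table user_stats \
  --export-dir /user/hive/warehouse/test_db.db/user_stats \
  --input-fields-terminated-by '\001' \
  --input-null-string '\\N' \
  --input-null-non-string '\\N'

If the column order differs between the Hive data and the MySQL table, --columns can be used to map the exported fields explicitly.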

Also, a Sqoop export to MySQL needs --export-dir set to a path on HDFS; make sure that HDFS path is correct and check whether there are multiple files under it, for example as shown below.
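A quick way to inspect the directory (the paths and file name are placeholders matching the sketch above):

# Confirm the export directory exists and list the files inside it
hdfs dfs -ls /user/hive/warehouse/test_db.db/user_stats

# Peek at a few rows to verify the field delimiter and the \N null markers
hdfs dfs -cat /user/hive/warehouse/test_db.db/user_stats/000000_0 | head -n 5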

If Sqoop is executed by Azkaban, you also need to add the JDK path inside the Sqoop job, since the job does not necessarily inherit the environment of your login shell; verify that the JDK path is correct:

export JAVA_HOME=/usr/local/jdk1.8
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH
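For completeness, a hedged sketch of the Azkaban job definition that would run such a script (the file name export_to_mysql.sh and the job name are assumptions, not from the original post); the three export lines above would sit at the top of that script, before the sqoop export call:

# export_hive_to_mysql.job -- hypothetical Azkaban job definition
type=command
command=bash export_to_mysql.sh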