
Hive execution error

Running Hive throws an error; even the simplest show databases fails:

hive> show databases;
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
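
The SemanticException only wraps the real cause. One way to surface the underlying metastore error (a debugging sketch, not part of the original post) is to rerun the statement with console debug logging:

# Print the full metastore/JDBC stack trace to the console.
hive --hiveconf hive.root.logger=DEBUG,console -e 'show databases;'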

I looked into all sorts of possible causes without solving it. Later, thinking I would re-initialize the metastore, I discovered there is also this command: schematool -dbType mysql -info
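
For context, a sketch of the schematool invocations in play here, assuming Hive 2.x with $HIVE_HOME/bin on the PATH:

# Report the metastore connection settings and schema version.
schematool -dbType mysql -info

# Create the metastore schema from scratch if it was never initialized.
schematool -dbType mysql -initSchema

Add --verbose to either invocation to get a full stack trace on failure.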

The -info option inspects the state of the Hive metastore directly, so I ran it, and it failed as well:

org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : The server time zone value 'EDT' is unrecognized or represents more than one time zone. You must configure either the server or JDBC driver (via the serverTimezone configuration property) to use a more specifc time zone value if you want to utilize time zone support.
SQL Error code: 0
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
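
Note that the error message itself suggests an alternative, driver-side fix: add the serverTimezone property to the JDBC connection URL instead of changing the server. A sketch of that route, assuming the stock hive-site.xml layout:

# Locate the metastore JDBC URL (the config path is an assumption).
grep -A 2 'javax.jdo.option.ConnectionURL' "$HIVE_HOME/conf/hive-site.xml"

# Then append serverTimezone to the URL value, for example:
#   jdbc:mysql://192.168.186.101:3306/hive?useSSL=false&serverTimezone=Asia/Shanghai
# (write '&' as '&amp;' inside the XML value).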

I had hit this problem during initialization as well; it comes down to a time zone mismatch between the OS and MySQL. The fix, step by step:

  1. Run the date command in a terminal to check the system time.
  2. Log in to MySQL with mysql -uroot -ppassword.
  3. In MySQL, run show variables like '%time_zone%';
  4. Then run select now();
  5. Notice that the terminal time and the MySQL time disagree (they may look close yet still be offset); this is the root cause.
  6. In MySQL, run set time_zone=SYSTEM;
  7. Still in MySQL, run set global time_zone='+8:00';
  8. Run flush privileges; to apply the changes (the whole exchange is consolidated in the sketch after this list).
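
Put together, the whole exchange looks roughly like this (a sketch; the root password and the '+8:00' offset are taken from the steps above, so adjust them to your setup):

# Compare the OS clock with MySQL's idea of the time.
date
mysql -uroot -ppassword -e "show variables like '%time_zone%'; select now();"

# Pin the global time zone to UTC+8 so new connections (such as Hive's
# JDBC driver) see an unambiguous offset, then flush. The session-level
# set time_zone=SYSTEM from step 6 is skipped here: a -e session exits
# immediately, so only the global setting carries over.
mysql -uroot -ppassword -e "set global time_zone = '+8:00'; flush privileges;"

# Confirm the change took effect.
mysql -uroot -ppassword -e "show variables like '%time_zone%';"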

After completing the steps, exit MySQL and run schematool -dbType mysql -info again. This time it prints:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/apps/apache-hive-2.3.3-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/apps/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://192.168.186.101:3306/hive?useSSL=false
Metastore Connection Driver :    com.mysql.cj.jdbc.Driver
Metastore connection User:       root
Hive distribution version:       2.3.0
Metastore schema version:        2.3.0
schemaTool completed

Success. After that, running hive again works, and queries execute correctly.
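
One caveat worth adding (my note, not part of the original fix): set global time_zone does not survive a mysqld restart. To make the setting permanent, persist it in the server configuration; the config path and service name below are assumptions for a typical Linux install:

# Append a persistent default (repeated [mysqld] groups are merged).
sudo tee -a /etc/my.cnf <<'EOF'

[mysqld]
default-time-zone = '+8:00'
EOF

# Restart MySQL so the option takes effect.
sudo systemctl restart mysqld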