
Hive fails to start: hive.metastore.HiveMetaStoreClient

I had been running hive-0.9.0 and wanted to integrate it with HBase, so I upgraded Hive to 0.13.1. Out of laziness I copied the old conf files over, which produced the errors below. hive.metastore.local and hive.metastore.ds.retry.* are deprecated in the new version; simply delete those entries from the configuration file.

java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
javax.jdo.JDOFatalUserException: Class org.datanucleus.jdo.JDOPersistenceManagerFactory was not found.
Both problems came from that shortcut: the conf contents differ between the two versions, so edit the new configuration by hand instead of copying the old one.
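A quick way to locate the stale entries in a copied hive-site.xml is a grep for the deprecated property names. This is a sketch: it builds a throwaway sample config so it runs anywhere; in practice, point CONF at your real file (e.g. under $HIVE_HOME/conf) and delete the matching &lt;property&gt; blocks.

```shell
# Minimal sketch: detect metastore properties deprecated after Hive 0.9
# in a copied hive-site.xml. The inline sample stands in for a real
# config file; replace CONF with your actual hive-site.xml path.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<property><name>hive.metastore.local</name><value>true</value></property>
<property><name>hive.metastore.ds.retry.attempts</name><value>5</value></property>
EOF
# Every hit below is a <property> block to remove before starting Hive 0.13:
grep -nE 'hive\.metastore\.local|hive\.metastore\.ds\.retry' "$CONF"
```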

[[email protected] conf]$ hive
15/07/20 11:36:12 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
15/07/20 11:36:12 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.*
no longer has any effect.  Use hive.hmshandler.retry.* instead

Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        ... 12 more
Caused by: javax.jdo.JDOFatalUserException: Class org.datanucleus.jdo.JDOPersistenceManagerFactory was not found.
NestedThrowables:
java.lang.ClassNotFoundException: org.datanucleus.jdo.JDOPersistenceManagerFactory
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
        ... 17 more
Caused by: java.lang.ClassNotFoundException: org.datanucleus.jdo.JDOPersistenceManagerFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
        ... 36 more
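As for the JDOFatalUserException at the bottom of the trace: a likely cause (an assumption on my part, the log alone does not prove it) is that the old config pins javax.jdo.PersistenceManagerFactoryClass to org.datanucleus.jdo.JDOPersistenceManagerFactory, while the DataNucleus 3.x bundled with Hive 0.13 renamed that class to org.datanucleus.api.jdo.JDOPersistenceManagerFactory. A sketch of the rename, run on a temp copy here; point CONF at your real hive-site.xml:

```shell
# Sketch: update the JDO factory class name that moved in DataNucleus 3.x.
# The echo fakes the relevant line of an old hive-site.xml for illustration.
CONF=$(mktemp)
echo '<value>org.datanucleus.jdo.JDOPersistenceManagerFactory</value>' > "$CONF"
# Rewrite the pre-3.x class name to its new package in place:
sed -i 's/org\.datanucleus\.jdo\.JDOPersistenceManagerFactory/org.datanucleus.api.jdo.JDOPersistenceManagerFactory/' "$CONF"
cat "$CONF"
```

Simply deleting the whole javax.jdo.PersistenceManagerFactoryClass property also works, since the new version's default is already correct.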

After editing the config by hand:
[[email protected] conf]$ hive
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
hive>

That fixed it. Writing this down in case it helps someone else.
