
Kerberos Authentication in the Hadoop Ecosystem, Part 4: Hive



I. Preparation

Stop the Hadoop cluster;
install and start the Kerberos authentication service (the KDC set up earlier in this series).
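
A quick sanity check that the KDC is healthy before touching Hive (a sketch; the service names assume a CentOS-style installation like the one used earlier in this series):

# The KDC and the admin server should both be active
systemctl status krb5kdc kadmin

# Listing the existing principals confirms kadmin works
kadmin.local -q "listprincs"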

II. Hive Configuration

The node in this article is named node.

1. Create the principal and keytab

kadmin.local:  addprinc hive/node
kadmin.local:  ktadd -norandkey -k /usr/data/kerberos/keytab/hive.keytab hive/node

Ideally you would create a new hive/node principal as above and export it to its own keytab file. This article, however, reuses the root/node principal created earlier in the series, along with the keytab exported at that time (e.g. /usr/data/kerberos/keytab/root.keytab).
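
Either way, it is worth confirming that the keytab actually works before wiring it into Hive. A minimal check, using the root.keytab path above:

# Show which principals (and key version numbers) the keytab contains
klist -kt /usr/data/kerberos/keytab/root.keytab

# Authenticate from the keytab; klist should then show a valid TGT
kinit -kt /usr/data/kerberos/keytab/root.keytab root/node
klist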

2. Modify the Hive configuration file

Edit hive-site.xml (REALM below stands for the Kerberos realm configured earlier in this series):

<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>root/node@REALM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/usr/data/kerberos/keytab/root.keytab</value>
</property>
<property>
  <name>hive.server2.authentication.spnego.keytab</name>
  <value>/usr/data/kerberos/keytab/root.keytab</value>
</property>
<property>
  <name>hive.server2.authentication.spnego.principal</name>
  <value>root/node@REALM</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/usr/data/kerberos/keytab/root.keytab</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>root/node@REALM</value>
</property>
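
A side note, not needed on this single-node setup: on multi-node clusters the host part of each principal is usually written as _HOST, which Hive expands to the local hostname at startup, so the same hive-site.xml can be shipped to every node. For example:

<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@REALM</value>
</property>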

3. Modify Hadoop's core-site.xml

Because hive.server2.enable.doAs is set to true above, HiveServer2 impersonates the connecting user, so Hadoop must trust the Hive service user as a proxy. If this was not configured previously, add the following:

<property>
  <name>hadoop.proxyuser.hive.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>

If you added these entries, restart Hadoop for them to take effect (or refresh them at runtime; see the sketch below). Since this article runs the services as the root/node principal, the existing proxyuser settings for root are reused instead:

<property>
  <name>hadoop.proxyuser.root.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
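
If a full restart is inconvenient, the proxyuser settings can usually be reloaded at runtime instead (standard HDFS/YARN admin commands, run as the cluster administrator):

# Reload hadoop.proxyuser.* on the NameNode without restarting HDFS
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
# Do the same on the ResourceManager
yarn rmadmin -refreshSuperUserGroupsConfiguration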

III. Verification

1. Start the services

# Start the metastore
nohup hive --service metastore 2>/usr/data/hive/log/metastore/metastore.err &
# Start HiveServer2
nohup hiveserver2 2>/usr/data/hive/log/hiveserver2/hiveserver.err &

Or, redirecting stdout instead of stderr:

# Start the metastore
nohup hive --service metastore >/usr/data/hive/log/metastore/metastore.log &
# Start HiveServer2
nohup hiveserver2 >/usr/data/hive/log/hiveserver2/hiveserver.log &
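
Besides checking processes with jps below, you can confirm both services are listening on their default ports (9083 for the metastore, 10000 for HiveServer2); a quick sketch:

# Both ports should appear in LISTEN state once startup completes
ss -lnt | grep -E ':(9083|10000)'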

If both start successfully, everything is in order:

[root@node hadoop]# jps
94561 HMaster            # HMaster (HBase)
2830 NameNode            # NameNode (HDFS)
82193 RunJar             # metastore (Hive)
92176 RunJar             # HiveServer2 (Hive)
94711 HRegionServer     # HRegionServer (HBase)
3352 ResourceManager     # YARN
34104 QuorumPeerMain     # ZooKeeper
4314 RunJar
3228 Secur               # the secured DataNode (HDFS)
2973 SecondaryNameNode   # SecondaryNameNode (HDFS)
3583 NodeManager         # YARN
6879 JobHistoryServer    # JobHistoryServer (YARN)
106974 Jps

2. Connect with Beeline
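
Before connecting, the client must hold a valid Kerberos ticket; without one the GSSAPI handshake fails before Beeline even prompts. A minimal sketch, reusing the keytab from section II:

# Obtain a TGT for the principal configured in hive-site.xml
kinit -kt /usr/data/kerberos/keytab/root.keytab root/node
# The ticket cache should now show a krbtgt entry
klist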

beeline -u "jdbc:hive2://node:10000/default;principal=root/node@REALM"

Log in with the principal configured in hive-site.xml, and append the database name to the URL (e.g. /default). The quotes must not be omitted: without them, the shell treats the semicolon as a command separator, so the principal=... part never reaches Beeline and the client falls back to a PLAIN (unauthenticated) connection, which the Kerberos-only server rejects:

[root@node conf]# beeline -u jdbc:hive2://node:10000/default;principal=root/node@REALM
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/apache-hive-2.3.7-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://node:10000/default
20/12/25 14:21:26 [main]: WARN jdbc.HiveConnection: Failed to connect to node:10000
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://node:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN (state=08S01,code=0)
Beeline version 2.3.7 by Apache Hive
beeline> 

With the quotes in place, the login succeeds and normal operations work:

[root@node conf]# beeline -u "jdbc:hive2://node:10000/default;principal=root/node@REALM"
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/apache-hive-2.3.7-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://node:10000/default;principal=root/node@REALM
Connected to: Apache Hive (version 2.3.7)
Driver: Hive JDBC (version 2.3.7)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.3.7 by Apache Hive
0: jdbc:hive2://node:10000/default> show tables;
+----------------------------------------------------+
|                      tab_name                      |
+----------------------------------------------------+
| kylin_intermediate_capacity_stats_resource_cube_9565437e_c2de_e276_ee4c_4cafd939c159 |
+----------------------------------------------------+
1 row selected (1.813 seconds)
0: jdbc:hive2://node:10000/default>
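
For scripting, the same connection can also be exercised non-interactively with -e (again, the quotes are required):

# Run a single statement and exit; the exit status reflects success or failure
beeline -u "jdbc:hive2://node:10000/default;principal=root/node@REALM" -e "show tables;"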