
Hive JDBC Java Example Project

I. Setting Up the Hive JDBC Development Environment

First, start the Hive server process:

hive --service hiveserver2 &

The default port is 10000.
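Before attempting a JDBC connection, it can help to confirm that something is actually listening on that port. The sketch below is a plain-JDK reachability check (the class and method names are my own, not part of Hive); the host and port are examples:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Quick check that a TCP port (e.g. HiveServer2's default 10000) is
// accepting connections before trying a JDBC connection.
public class PortCheck {
    public static boolean isOpen(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;   // connection accepted: something is listening
        } catch (Exception e) {
            return false;  // refused or timed out: not reachable
        }
    }

    public static void main(String[] args) {
        System.out.println(isOpen("localhost", 10000, 2000)
                ? "port 10000 is open" : "port 10000 is closed");
    }
}
```

If this prints "closed", check that HiveServer2 started successfully before debugging the JDBC code itself.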

Create a Maven project

Project structure: (screenshot omitted)

  • Create a new TestHiveJdbc project
  • Create pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.cn.whr.HiveTestJdbc</groupId>
    <artifactId>com.cn.whr.HiveTestJdbc</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>com.cn.whr.HiveTestJdbc</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <repositories>
        <repository>
            <id>spring-milestones</id>
            <url>http://repo.spring.io/libs-milestone/</url>
        </repository>
    </repositories>
    <dependencies>
        <dependency>
            <groupId>javax.jdo</groupId>
            <artifactId>jdo2-api</artifactId>
            <version>2.3-ec</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/asm/asm -->
        <dependency>
            <groupId>asm</groupId>
            <artifactId>asm</artifactId>
            <version>3.3.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>2.1.1</version>
            <classifier>standalone</classifier>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>
</project>

Log4j2.xml

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="[%-5p] %d %c - %m%n" />
        </Console>
        <File name="File" fileName="dist/my.log">
            <PatternLayout pattern="%m%n" />
        </File>
    </Appenders>

    <Loggers>
        <Logger name="mh.sample2.Log4jTest2" level="INFO">
            <AppenderRef ref="File" />
        </Logger>
        <Root level="INFO">
            <AppenderRef ref="Console" />
        </Root>
    </Loggers>
</Configuration>

App.java

package com.cn.whr.HiveTestJdbc;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

/**
 * Minimal Hive JDBC query example.
 */
public class App 
{
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    public static void main(String[] args)
    {
        String sql = "select * from student";
        try {
            Class.forName(driverName);
            System.out.println("Running: " + sql);
            // try-with-resources closes the connection, statement and result set
            try (Connection con = DriverManager.getConnection(
                         "jdbc:hive2://192.10.200.81:10000/comm_data", "hive", "");
                 Statement stmt = con.createStatement();
                 ResultSet res = stmt.executeQuery(sql)) {
                System.out.println("ok");
                while (res.next()) {
                    System.out.println(res.getString(1) + "\t" + res.getString(2) + "\t"
                            + res.getString(3) + "\t" + res.getString(4));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("error");
        }
    }
}
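The string passed to DriverManager.getConnection follows the pattern jdbc:hive2://&lt;host&gt;:&lt;port&gt;/&lt;database&gt;. As a minimal sketch, a small helper (the HiveUrl class is hypothetical, just for illustration) makes the pieces explicit:

```java
// Builds a HiveServer2 JDBC URL of the form jdbc:hive2://host:port/database.
// The helper name is illustrative, not part of the Hive API.
public class HiveUrl {
    public static String of(String host, int port, String database) {
        return "jdbc:hive2://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) {
        // Reproduces the URL used in App.java above.
        System.out.println(of("192.10.200.81", 10000, "comm_data"));
    }
}
```

Keeping host, port, and database as separate values makes it easy to move them into a properties file later instead of hard-coding them.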


The SLF4J conflict has not been resolved yet; I will note the solution here once I find one.

II. Problems Encountered

1. When running the project, performing a Maven update may cause Log4j2.xml to be reported as not found.
Run Project > Clean once to fix this.

2. FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. User: root is not allowed to impersonate hive
Modify core-site.xml:

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>
lsof -i tcp:8088

Find the corresponding process and use

kill -9 <process ID>

to kill it, then start the cluster again; the problem is resolved.

3. java.lang.OutOfMemoryError: PermGen space
Modify hadoop-env.sh:

export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"

Increase the Xmx value there as needed.
Alternatively, modify mapred-site.xml:


<property> 
    <name>mapred.child.java.opts</name> 
    <value>-Xmx1024m</value> 
    <final>true</final> 
</property>
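To confirm that a larger -Xmx value actually reaches a client JVM, you can print the maximum heap the runtime reports. A minimal sketch (the class name is my own):

```java
// Prints the maximum heap size the current JVM will use, in megabytes.
// Run with e.g. `java -Xmx2048m HeapCheck` to confirm the setting took effect.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Note this only verifies the local client's heap; the mapred.child.java.opts setting above applies to the task JVMs that Hadoop launches for the job.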