
Accessing Hive 2.3.0 from Spark 2.1.1

Add the JDBC driver dependency (needed here because the Hive metastore is backed by MySQL):

    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.43</version>
    </dependency>

Add the spark-hive support dependency:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>

Copy hive-site.xml, hdfs-site.xml, and core-site.xml into the resources directory.
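As a rough sketch, the hive-site.xml copied into resources typically carries the metastore connection settings that the MySQL driver above serves. The host name, database name, and credentials below are placeholders for illustration, not values from the original setup:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- JDBC connection to the MySQL-backed Hive metastore (placeholder host/db) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/hive?useSSL=false</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- placeholder credentials; substitute your own -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
```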

Run the Spark program:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder()
      .master("local[*]")
      .appName("hive")
      .enableHiveSupport()
      .getOrCreate()

    import spark.sql

    sql("CREATE TABLE IF NOT EXISTS src (ci STRING)")
    sql("LOAD DATA LOCAL INPATH 'src/main/resources/1.txt' INTO TABLE src")
    sql("SELECT * FROM src").show()
    sql("SELECT COUNT(*) FROM src").show()
    sql("DROP TABLE IF EXISTS LOC")
    // df was undefined in the original snippet; here we save the contents
    // of src as the new Hive table LOC
    val df = sql("SELECT * FROM src")
    df.write.saveAsTable("LOC")

If table creation fails with a permission error (Spark cannot write to HDFS), relax the permissions on the warehouse directory on HDFS:

hadoop fs -chmod 777 /user/hive/warehouse
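Note that chmod 777 opens the warehouse to every user. A narrower alternative is to hand ownership of the directory to the account running the Spark job; the user and group names below are placeholders for illustration:

```shell
# give the warehouse to the job's user instead of opening it to everyone
# (replace spark:hadoop with the actual user and group running the job)
hadoop fs -chown -R spark:hadoop /user/hive/warehouse
```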