Kylin Installation and Building a Cube with Spark
II. Environment
1. Component versions:
Hadoop: 2.7+
Hive: 0.13 - 1.2.1+
HBase: 0.98 - 0.99, 1.1+
JDK: 1.7+
2. HDP 2.4 or HDP 2.5
III. Installation
1. Upload the installation package /opt/apache-kylin-2.2.0-bin-hbase1x.tar.gz and extract it.
2. Configure the KYLIN_HOME environment variable by editing /etc/profile, then reload it as shown below:
export KYLIN_HOME=/opt/apache-kylin-2.2.0-bin
export PATH=$PATH:$KYLIN_HOME/bin
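After saving /etc/profile, reload it so the variables take effect in the current shell. A quick check (assuming a bash shell):
source /etc/profile
echo $KYLIN_HOME   # should print /opt/apache-kylin-2.2.0-bin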
3. Run check-env.sh to check the environment.
4. Run kylin.sh start. If there are no errors, Kylin is available at http://<hostname>:7070/kylin with the default account ADMIN/KYLIN (example commands below).
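A minimal run-through of steps 3 and 4, assuming KYLIN_HOME is set as above and that Kylin writes its log to the default location $KYLIN_HOME/logs/kylin.log:
$KYLIN_HOME/bin/check-env.sh         # verify the Hadoop/Hive/HBase environment
$KYLIN_HOME/bin/kylin.sh start       # start the Kylin server
tail -f $KYLIN_HOME/logs/kylin.log   # watch the startup log for errors
$KYLIN_HOME/bin/kylin.sh stop        # stop Kylin when needed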
5. If you do not use the Spark engine, the installation is complete after the previous step. To configure the Spark engine, carry out the following additional steps.
1) Link (or copy) the configuration files that the Spark engine needs:
mkdir $KYLIN_HOME/hadoop-conf
ln -s /etc/hadoop/conf/core-site.xml $KYLIN_HOME/hadoop-conf/core-site.xml
ln -s /etc/hadoop/conf/hdfs-site.xml $KYLIN_HOME/hadoop-conf/hdfs-site.xml
ln -s /etc/hadoop/conf/yarn-site.xml $KYLIN_HOME/hadoop-conf/yarn-site.xml
ln -s /etc/hbase/2.4.0.0-169/0/hbase-site.xml $KYLIN_HOME/hadoop-conf/hbase-site.xml
cp /etc/hive/2.4.0.0-169/0/hive-site.xml $KYLIN_HOME/hadoop-conf/hive-site.xml
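To confirm the directory is populated correctly, list it (a quick sanity check; the HDP component paths such as /etc/hbase/2.4.0.0-169/0 may differ on your cluster):
ls -l $KYLIN_HOME/hadoop-conf
# expected: symlinks for core-site.xml, hdfs-site.xml, yarn-site.xml, hbase-site.xml and a copy of hive-site.xml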
2) Edit hive-site.xml:
vi $KYLIN_HOME/hadoop-conf/hive-site.xml   (change the "hive.execution.engine" value from "tez" to "mr")
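After the edit, the relevant property in the copied hive-site.xml should look like this minimal excerpt:
<property>
  <name>hive.execution.engine</name>
  <value>mr</value>
</property>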
3) Edit $KYLIN_HOME/conf/kylin.properties and point Kylin at the configuration directory created above (adjust the path if your KYLIN_HOME differs):
kylin.env.hadoop-conf-dir=/opt/apache-kylin-2.2.0-bin/hadoop-conf
4) The Spark engine uses the Spark distribution bundled with Kylin, so a few Spark parameters still need to be set. Edit $KYLIN_HOME/conf/kylin.properties; the parameters below are commented out by default and must be uncommented (on HDP, pay particular attention to the -Dhdp.version options at the end).
kylin.engine.spark-conf.spark.executor.memory and kylin.engine.spark-conf.spark.executor.cores can be increased as your environment allows:
kylin.engine.spark-conf.spark.master=yarn
kylin.engine.spark-conf.spark.submit.deployMode=cluster
kylin.engine.spark-conf.spark.yarn.queue=default
kylin.engine.spark-conf.spark.executor.memory=1G
kylin.engine.spark-conf.spark.executor.cores=2
kylin.engine.spark-conf.spark.executor.instances=1
kylin.engine.spark-conf.spark.eventLog.enabled=true
kylin.engine.spark-conf.spark.eventLog.dir=hdfs\:///kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs\:///kylin/spark-history
#kylin.engine.spark-conf.spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
## uncomment for HDP
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
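Because spark.eventLog.dir and spark.history.fs.logDirectory point at hdfs:///kylin/spark-history, that directory should exist in HDFS before the first Spark cube build. A minimal sketch, assuming the hadoop client is on the PATH and the current user can write under /kylin:
hadoop fs -mkdir -p /kylin/spark-history
After restarting Kylin so the new properties take effect, Spark can then be chosen as the Cube Engine in the cube designer's advanced settings.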