
Installing Kylin via Ambari succeeds but Kylin fails to start: errors and fixes


1. cannot stat ‘/usr/hdp/3.0.1.0-187/kylin/pid’: No such file or directory

Starting Kylin from Ambari fails with an error along these lines:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/KYLIN/package/scripts/kylin_query.py", line 74, in <module>
    KylinQuery().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/KYLIN/package/scripts/kylin_query.py", line 53, in start
    Execute(cmd, user='hdfs')
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '. /var/lib/ambari-agent/tmp/kylin_env.rc;/usr/hdp/3.0.1.0-187/kylin/bin/kylin.sh start;cp -rf /usr/hdp/3.0.1.0-187/kylin/pid /var/run/kylin/kylin.pid' returned 1. Retrieving hadoop conf dir...
...................................................[PASS]
KYLIN_HOME is set to /usr/hdp/3.0.1.0-187/kylin
Checking HBase
...................................................[PASS]
Checking hive
...................................................[PASS]
Checking hadoop shell
...................................................[PASS]
Checking hdfs working dir
...................................................[PASS]
Retrieving Spark dependency...
...................................................[PASS]
Retrieving Flink dependency...
Optional dependency flink not found, if you need this; set FLINK_HOME, or run bin/download-flink.sh
...................................................[PASS]
Retrieving kafka dependency...
Couldn't find kafka home. If you want to enable streaming processing, Please set KAFKA_HOME to the path which contains kafka dependencies.
...................................................[PASS]

Checking environment finished successfully. To check again, run 'bin/check-env.sh' manually.
Retrieving hive dependency...
Something wrong with Hive CLI or Beeline, please execute Hive CLI or Beeline CLI in terminal to find the root cause.
cp: cannot stat ‘/usr/hdp/3.0.1.0-187/kylin/pid’: No such file or directory

This is likely because Kylin could not start for lack of permissions. Try running as the hdfs user so that it has access to Hive:

[root@worker kylin]# su hdfs
[hdfs@worker kylin]$ /usr/hdp/3.0.1.0-187/kylin/bin/sample.sh
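What the cp error at the end of the log actually means: kylin.sh start writes $KYLIN_HOME/pid only if startup gets far enough, so Ambari's follow-up cp fails because Kylin never came up. A minimal post-mortem check along those lines (the check_kylin_pid helper is mine, and the path assumes the default HDP 3.0.1.0-187 layout):

```shell
#!/bin/sh
# Sketch of a post-mortem check; adjust KYLIN_HOME to your install.
check_kylin_pid() {
  # kylin.sh writes $KYLIN_HOME/pid on a successful start, so a missing
  # file means startup failed before Ambari's cp could copy it.
  if [ -f "$1/pid" ]; then
    echo "kylin is up, pid $(cat "$1/pid")"
  else
    echo "no pid file under $1 - check $1/logs/kylin.log for the real error"
  fi
}

check_kylin_pid "${KYLIN_HOME:-/usr/hdp/3.0.1.0-187/kylin}"
```

If the pid file is missing, the Kylin startup log usually names the real cause; here it is the Hive CLI/Beeline problem called out in the output above.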

If running the command above fails again with Cannot modify hive.security.authorization.sqlstd.confwhitelist.append at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0),


then in Ambari's Services menu open Hive, enter hive.security in CONFIGS -> Filter on the right, and uncheck the box next to Enable Authorization (the original post shows a screenshot of this step). After saving, re-run /usr/hdp/3.0.1.0-187/kylin/bin/sample.sh as the hdfs user and the problem should be resolved.

2. Cannot modify hive.security.authorization.sqlstd.confwhitelist.append at runtime

This error is in fact the one from the previous section, produced when the hdfs user runs /usr/hdp/3.0.1.0-187/kylin/bin/sample.sh; it is listed separately here to make it easier to look up.
Solution:
In Ambari's Services menu open Hive, enter hive.security in CONFIGS -> Filter on the right, and uncheck the box next to Enable Authorization (the original post shows a screenshot of this step). After saving, re-run /usr/hdp/3.0.1.0-187/kylin/bin/sample.sh as the hdfs user and the problem should be resolved.
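After saving the Enable Authorization change and restarting Hive from Ambari, you can confirm the change actually reached a given node by reading hive.security.authorization.enabled out of that node's hive-site.xml. A rough sketch (the hive_auth_enabled helper and the /etc/hive/conf default are my assumptions; the grep/sed scrape relies on Ambari writing each <name>/<value> pair on consecutive lines, as it normally does):

```shell
#!/bin/sh
# Read the effective value of hive.security.authorization.enabled from a
# hive-site.xml, to confirm the Ambari config change landed on this node.
hive_auth_enabled() {
  # crude XML scrape; adequate for the flat property list hive-site.xml uses
  grep -A1 '<name>hive.security.authorization.enabled</name>' "$1" \
    | sed -n 's/.*<value>\(.*\)<\/value>.*/\1/p'
}

conf="${HIVE_CONF_DIR:-/etc/hive/conf}/hive-site.xml"
if [ -f "$conf" ]; then
  hive_auth_enabled "$conf"
else
  echo "no hive-site.xml at $conf"
fi
```

If this prints anything other than false, the config change has not propagated yet; restart the affected Hive components from Ambari and check again before re-running sample.sh.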