
Installing Ambari 2.7 with HDP 3.0

1. The installation went fairly smoothly overall, but two issues came up along the way: hostnames must be all lowercase, and the database connection should use an IP address rather than a hostname.
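The lowercase-hostname requirement can be checked up front, before registering hosts with Ambari. A minimal sketch (the `check_hostname` helper is my own, not part of Ambari):

```python
import socket

def check_hostname(name):
    # Ambari agents register hosts by FQDN; mixed-case names can cause
    # registration/lookup mismatches, so require all-lowercase.
    return name == name.lower()

# Check the local machine before adding it to the cluster:
if not check_hostname(socket.gethostname()):
    print("Rename this host to all lowercase before installing Ambari")
```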

An error was reported while installing the services:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 222, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
 stdout:
2018-07-15 13:37:13,688 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-07-15 13:37:13,694 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-07-15 13:37:13,696 - Group['kms'] {}
2018-07-15 13:37:13,697 - Group['livy'] {}
2018-07-15 13:37:13,697 - Group['spark'] {}
2018-07-15 13:37:13,697 - Group['ranger'] {}
2018-07-15 13:37:13,697 - Group['hdfs'] {}
2018-07-15 13:37:13,697 - Group['zeppelin'] {}
2018-07-15 13:37:13,698 - Group['hadoop'] {}
2018-07-15 13:37:13,698 - Group['users'] {}
2018-07-15 13:37:13,698 - Group['knox'] {}
2018-07-15 13:37:13,711 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,721 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,722 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,724 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,725 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-07-15 13:37:13,726 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,727 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,728 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,730 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-07-15 13:37:13,731 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,733 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,734 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,735 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,737 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,738 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-07-15 13:37:13,739 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,740 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-07-15 13:37:13,744 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,746 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,747 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,748 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-07-15 13:37:13,749 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-07-15 13:37:13,751 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-07-15 13:37:13,756 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-07-15 13:37:13,756 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-07-15 13:37:13,757 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-07-15 13:37:13,759 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-07-15 13:37:13,760 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-07-15 13:37:13,769 - call returned (0, '1028')
2018-07-15 13:37:13,770 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1028'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-07-15 13:37:13,774 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1028'] due to not_if
2018-07-15 13:37:13,774 - Group['hdfs'] {}
2018-07-15 13:37:13,775 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-07-15 13:37:13,775 - FS Type: HDFS
2018-07-15 13:37:13,775 - Directory['/etc/hadoop'] {'mode': 0755}
2018-07-15 13:37:13,775 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-07-15 13:37:13,788 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://192.168.32.139/ambari2.7hdp3.0/hdp/centos7/3.0.0.0-1634/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-07-15 13:37:13,794 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://192.168.32.139/ambari2.7hdp3.0/hdp/centos7/3.0.0.0-1634/\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-07-15 13:37:13,800 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-07-15 13:37:13,827 - Repository['HDP-3.0-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-07-15 13:37:13,833 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://192.168.32.139/ambari2.7hdp3.0/hdp/centos7/3.0.0.0-1634/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-07-15 13:37:13,833 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-07-15 13:37:13,845 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://192.168.32.139/ambari2.7hdp3.0/HDP-UTILS/centos7/1.1.0.22/', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-07-15 13:37:13,848 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://192.168.32.139/ambari2.7hdp3.0/hdp/centos7/3.0.0.0-1634/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://192.168.32.139/ambari2.7hdp3.0/HDP-UTILS/centos7/1.1.0.22/\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-07-15 13:37:13,848 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-07-15 13:37:13,851 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-07-15 13:37:13,920 - Skipping installation of existing package unzip
2018-07-15 13:37:13,920 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-07-15 13:37:13,943 - Skipping installation of existing package curl
2018-07-15 13:37:13,943 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-07-15 13:37:13,954 - Skipping installation of existing package hdp-select

Command failed after 1 tries

I then recalled that version 2.6.5 had previously been installed on this machine.

Probable cause: a previous installation had been attempted but did not complete, leaving stale files on the filesystem that caused subsequent install retries to fail.
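The traceback shows where the retry breaks: the agent shells out to `/usr/bin/hdp-select` to enumerate installable packages, and a half-installed `hdp-select` makes that call fail. A rough sketch of that check (simplified for illustration, not the actual Ambari source):

```python
import subprocess

def get_supported_packages(stack_selector_path="/usr/bin/hdp-select"):
    # Simplified version of the agent-side check: run the selector
    # tool and fail loudly if it cannot be queried.
    try:
        proc = subprocess.run([stack_selector_path, "packages"],
                              capture_output=True, text=True)
    except OSError:
        proc = None  # binary missing or not executable
    if proc is None or proc.returncode != 0:
        raise RuntimeError(
            "Unable to query for supported packages using {0}"
            .format(stack_selector_path))
    return proc.stdout.splitlines()
```

In other words, the error does not mean the repository is broken; it means `hdp-select` itself is in a state where it cannot be run or returns an error.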

Solution: run the following command to remove the stale package, then retry the install:

 yum -y erase hdp-select
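After erasing the package, it can help to confirm that no hdp-select remnants are still on disk before retrying the install. A small helper (my own, and the paths listed are the default HDP locations, not something Ambari provides):

```python
import os

def leftover_hdp_artifacts(paths=("/usr/bin/hdp-select", "/usr/hdp")):
    # Return any default-location HDP remnants that survived the erase;
    # an empty list means the retry should start from a clean slate.
    return [p for p in paths if os.path.exists(p)]
```

If the list is non-empty after `yum -y erase hdp-select`, remove the remaining paths by hand, then retry the service installation from the Ambari UI.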