HortonWorks HDP 2.6: problem installing DRPC Server via Ambari

Date: 2017-05-16 01:21:51

Tags: hortonworks-data-platform ambari

I am installing HDP 2.6 via Ambari 2.5.0.3 on SUSE 11 SP3, and the last step fails on one of my nodes. The failure seems to occur in the "DRPC Server Install" task. Here is the log:

stderr:   /var/lib/ambari-agent/data/errors-603.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/drpc_server.py", line 139, in <module>
    DrpcServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/drpc_server.py", line 44, in install
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 117, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/drpc_server.py", line 50, in configure
    storm()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/storm.py", line 86, in storm
    content=Template("storm.conf.j2")
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/etc/security/limits.d/storm.conf'] failed, parent directory /etc/security/limits.d doesn't exist

stdout:   /var/lib/ambari-agent/data/output-603.txt

2017-05-16 02:07:02,497 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 02:07:02,499 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-05-16 02:07:02,500 - Group['livy'] {}
2017-05-16 02:07:02,503 - Group['spark'] {}
2017-05-16 02:07:02,504 - Group['zeppelin'] {}
2017-05-16 02:07:02,505 - Group['hadoop'] {}
2017-05-16 02:07:02,506 - Group['users'] {}
2017-05-16 02:07:02,507 - Group['knox'] {}
2017-05-16 02:07:02,509 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,512 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,514 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,516 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,517 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,519 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 02:07:02,521 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,523 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 02:07:02,525 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 02:07:02,527 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-05-16 02:07:02,529 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,531 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,533 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,535 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,537 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 02:07:02,538 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,540 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,542 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:02,543 - Modifying user hdfs
2017-05-16 02:07:03,568 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,571 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,573 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,574 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,576 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,578 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 02:07:03,581 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 02:07:03,584 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-05-16 02:07:03,595 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-05-16 02:07:03,595 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-05-16 02:07:03,597 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 02:07:03,599 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-05-16 02:07:03,609 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-05-16 02:07:03,610 - Group['hdfs'] {}
2017-05-16 02:07:03,611 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs', 'hdfs']}
2017-05-16 02:07:03,613 - Modifying user hdfs
2017-05-16 02:07:03,664 - FS Type: 
2017-05-16 02:07:03,664 - Directory['/etc/hadoop'] {'mode': 0755}
2017-05-16 02:07:03,689 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-05-16 02:07:03,690 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2017-05-16 02:07:03,690 - Changing owner for /usr/hdp/current/hadoop-client/conf/hadoop-env.sh from 0 to hdfs
2017-05-16 02:07:03,691 - Changing group for /usr/hdp/current/hadoop-client/conf/hadoop-env.sh from 0 to hadoop
2017-05-16 02:07:03,692 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-05-16 02:07:03,718 - Initializing 2 repositories
2017-05-16 02:07:03,719 - Repository['HDP-2.6'] {'base_url': 'http://192.168.156.25/hdp/HDP/suse11sp3/2.x/updates/2.6.0.3', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-05-16 02:07:03,734 - Flushing package manager cache since repo file content is about to change
2017-05-16 02:07:03,734 - checked_call[['zypper', 'clean', '--all']] {'sudo': True}
2017-05-16 02:07:03,837 - checked_call returned (0, 'All repositories have been cleaned up.')
2017-05-16 02:07:03,837 - File['/etc/zypp/repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://192.168.156.25/hdp/HDP/suse11sp3/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 02:07:03,839 - Writing File['/etc/zypp/repos.d/HDP.repo'] because contents don't match
2017-05-16 02:07:03,840 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://192.168.156.25/hdp/HDP-UTILS-1.1.0.21/repos/suse11sp3', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-05-16 02:07:03,846 - Flushing package manager cache since repo file content is about to change
2017-05-16 02:07:03,846 - checked_call[['zypper', 'clean', '--all']] {'sudo': True}
2017-05-16 02:07:04,410 - checked_call returned (0, 'All repositories have been cleaned up.')
2017-05-16 02:07:04,411 - File['/etc/zypp/repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://192.168.156.25/hdp/HDP-UTILS-1.1.0.21/repos/suse11sp3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 02:07:04,411 - Writing File['/etc/zypp/repos.d/HDP-UTILS.repo'] because contents don't match
2017-05-16 02:07:04,412 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 02:07:04,884 - Skipping installation of existing package unzip
2017-05-16 02:07:04,884 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 02:07:05,170 - Skipping installation of existing package curl
2017-05-16 02:07:05,171 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 02:07:05,391 - Skipping installation of existing package hdp-select
2017-05-16 02:07:05,594 - Package['storm_2_6_0_3_8'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 02:07:06,118 - Installing package storm_2_6_0_3_8 ('/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm storm_2_6_0_3_8')
2017-05-16 02:07:36,817 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 02:07:36,820 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-05-16 02:07:36,822 - Directory['/var/log/storm'] {'owner': 'storm', 'group': 'hadoop', 'create_parents': True, 'mode': 0777, 'cd_access': 'a'}
2017-05-16 02:07:36,823 - Changing group for /var/log/storm from 117 to hadoop
2017-05-16 02:07:36,824 - Changing permission for /var/log/storm from 755 to 777
2017-05-16 02:07:36,825 - Directory['/var/run/storm'] {'owner': 'storm', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-05-16 02:07:36,826 - Changing group for /var/run/storm from 117 to hadoop
2017-05-16 02:07:36,827 - Directory['/hadoop/storm'] {'owner': 'storm', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-05-16 02:07:36,827 - Creating directory Directory['/hadoop/storm'] since it doesn't exist.
2017-05-16 02:07:36,827 - Changing owner for /hadoop/storm from 0 to storm
2017-05-16 02:07:36,828 - Changing group for /hadoop/storm from 0 to hadoop
2017-05-16 02:07:36,828 - Directory['/usr/hdp/current/storm-client/conf'] {'group': 'hadoop', 'create_parents': True, 'cd_access': 'a'}
2017-05-16 02:07:36,828 - Changing group for /usr/hdp/current/storm-client/conf from 0 to hadoop
2017-05-16 02:07:36,834 - File['/etc/security/limits.d/storm.conf'] {'content': Template('storm.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}

Command failed after 1 tries
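The failure is the very last File resource in the log: Ambari tries to write /etc/security/limits.d/storm.conf, but the parent directory /etc/security/limits.d does not exist on this node. A quick check on the failing node (as root) confirms whether the directory is there:

    # Show the PAM limits.d directory if it exists; errors out if it is missing
    ls -ld /etc/security/limits.d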

I found a similar issue at https://issues.apache.org/jira/browse/AMBARI-7792, but that one was already resolved in Ambari 1.7.

1 answer:

Answer 0 (score: 0)

I hit the same error with Storm Nimbus on SUSE 12 while deploying HDF 3.0 with Ambari 2.5.1.

I have raised it as a new Apache JIRA, https://issues.apache.org/jira/browse/AMBARI-21489, since AMBARI-7792 was supposedly fixed back in Ambari 1.7.

The obvious quick workaround is simply to mkdir /etc/security/limits.d, as sketched below.
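As a minimal sketch, run this as root on each affected node and then retry the failed step from the Ambari UI. The root:root ownership and 0755 mode are my assumptions, matching the usual defaults for this directory on Linux; Ambari itself writes storm.conf as root with mode 0644, as the log above shows.

    # Create the directory the Storm scripts expect to already exist
    mkdir -p /etc/security/limits.d
    chown root:root /etc/security/limits.d   # typical ownership for this directory (assumption)
    chmod 755 /etc/security/limits.d         # typical mode (assumption)

After that, retrying the "DRPC Server Install" task should let the File['/etc/security/limits.d/storm.conf'] resource go through.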