Python script timeout error in Ambari

Date: 2015-04-16 17:11:35

Tags: java python hadoop ambari

I ran into an error during the install, start, and test stages:

Python script has been killed due to timeout after waiting 900 secs

The log is attached below.

stderr: 
Python script has been killed due to timeout after waiting 900 secs
 stdout:
2015-04-16 16:43:04,609 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://vagrant-centos65.vagrantup.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:04,683 - Group['hadoop'] {'ignore_failures': False}
2015-04-16 16:43:04,685 - Adding group Group['hadoop']
2015-04-16 16:43:04,755 - Group['nobody'] {'ignore_failures': False}
2015-04-16 16:43:04,756 - Modifying group nobody
2015-04-16 16:43:04,816 - Group['users'] {'ignore_failures': False}
2015-04-16 16:43:04,817 - Modifying group users
2015-04-16 16:43:04,853 - Group['nagios'] {'ignore_failures': False}
2015-04-16 16:43:04,854 - Adding group Group['nagios']
2015-04-16 16:43:04,896 - Group['knox'] {'ignore_failures': False}
2015-04-16 16:43:04,896 - Adding group Group['knox']
2015-04-16 16:43:04,950 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2015-04-16 16:43:04,951 - Modifying user nobody
2015-04-16 16:43:05,025 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,026 - Adding user User['hive']
2015-04-16 16:43:05,131 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,131 - Adding user User['oozie']
2015-04-16 16:43:05,199 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,200 - Adding user User['nagios']
2015-04-16 16:43:05,384 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,385 - Adding user User['ambari-qa']
2015-04-16 16:43:05,504 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,505 - Adding user User['flume']
2015-04-16 16:43:05,612 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,613 - Adding user User['hdfs']
2015-04-16 16:43:05,682 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,683 - Adding user User['knox']
2015-04-16 16:43:05,738 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,738 - Adding user User['storm']
2015-04-16 16:43:05,806 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,806 - Adding user User['mapred']
2015-04-16 16:43:05,859 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,860 - Adding user User['hbase']
2015-04-16 16:43:05,916 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-16 16:43:05,917 - Adding user User['tez']
2015-04-16 16:43:05,972 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:05,972 - Adding user User['zookeeper']
2015-04-16 16:43:06,041 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,042 - Adding user User['kafka']
2015-04-16 16:43:06,097 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,098 - Adding user User['falcon']
2015-04-16 16:43:06,161 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,162 - Adding user User['sqoop']
2015-04-16 16:43:06,228 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,229 - Adding user User['yarn']
2015-04-16 16:43:06,287 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-16 16:43:06,287 - Adding user User['hcat']
2015-04-16 16:43:06,345 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-16 16:43:06,346 - Writing File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] because it doesn't exist
2015-04-16 16:43:06,346 - Changing permission for /var/lib/ambari-agent/data/tmp/changeUid.sh from 644 to 555
2015-04-16 16:43:06,347 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-04-16 16:43:06,469 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-16 16:43:06,470 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2015-04-16 16:43:06,599 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-04-16 16:43:06,599 - Creating directory Directory['/etc/hadoop/conf.empty']
2015-04-16 16:43:06,600 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-04-16 16:43:06,623 - Creating symbolic Link['/etc/hadoop/conf']
2015-04-16 16:43:06,651 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2015-04-16 16:43:06,651 - Writing File['/etc/hadoop/conf/hadoop-env.sh'] because it doesn't exist
2015-04-16 16:43:06,652 - Changing owner for /etc/hadoop/conf/hadoop-env.sh from 0 to hdfs
2015-04-16 16:43:06,684 - Repository['HDP-2.2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.2.4.2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-04-16 16:43:06,710 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-16 16:43:06,711 - Writing File['/etc/yum.repos.d/HDP.repo'] because it doesn't exist
2015-04-16 16:43:06,711 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-04-16 16:43:06,722 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-16 16:43:06,722 - Writing File['/etc/yum.repos.d/HDP-UTILS.repo'] because it doesn't exist
2015-04-16 16:43:06,722 - Package['unzip'] {}
2015-04-16 16:43:07,144 - Skipping installing existent package unzip
2015-04-16 16:43:07,145 - Package['curl'] {}
2015-04-16 16:43:07,463 - Skipping installing existent package curl
2015-04-16 16:43:07,464 - Package['hdp-select'] {}
2015-04-16 16:43:07,724 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
2015-04-16 16:43:11,315 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://vagrant-centos65.vagrantup.com:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:11,323 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://vagrant-centos65.vagrantup.com:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] due to not_if
2015-04-16 16:43:11,324 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-04-16 16:43:11,333 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
2015-04-16 16:43:11,564 - Package['hadoop_2_2_*-yarn'] {}
2015-04-16 16:43:11,825 - Installing package hadoop_2_2_*-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_2_*-yarn')

I'm using Ambari 1.7 and have been following this installation guide.

Any help would be greatly appreciated. Thanks.

3 Answers:

Answer 0 (score: 1)

With Ambari 2.2 (HDP 2.3.4.0) I ran into the same problem:

Python script has been killed due to timeout after waiting 1800 secs

It can be resolved by raising the timeout (agent.package.install.task.timeout=1800) in /etc/ambari-server/conf/ambari.properties.
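A minimal sketch of that change (the property name and file path come from the answer above; the demo edits a temporary copy rather than the live config):

```shell
# Demonstrated on a temporary copy; on a real server edit
# /etc/ambari-server/conf/ambari.properties instead.
PROPS=$(mktemp)
printf 'agent.package.install.task.timeout=900\n' > "$PROPS"

# Replace the existing value with the larger timeout (in seconds).
sed -i 's/^agent\.package\.install\.task\.timeout=.*/agent.package.install.task.timeout=1800/' "$PROPS"

grep '^agent.package.install.task.timeout' "$PROPS"
# -> agent.package.install.task.timeout=1800
```

After editing the real file, restart the server (`ambari-server restart`) so the new timeout takes effect.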

Answer 1 (score: 0)

I have solved this problem.

The issue was with the servers these packages are downloaded from. When those servers take too long to respond, this error appears, so the error itself is not the real troublemaker. You can try a different mirror for installing the packages, or install them manually on the host. Afterwards Ambari will check whether each package was installed correctly; if it was, it automatically tries the next package in the queue.
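A sketch of the mirror workaround (the mirror URL is a placeholder assumption; the repo file path and original baseurl match the HDP.repo written in the log above; the demo edits a temporary copy):

```shell
# Demo on a temporary copy; on a host the file is /etc/yum.repos.d/HDP.repo.
REPO=$(mktemp)
printf '[HDP-2.2]\nname=HDP-2.2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.2.4.2\n' > "$REPO"

# Point the repo at a closer mirror (placeholder URL).
sed -i 's|^baseurl=.*|baseurl=http://mirror.example.com/HDP/centos6/2.x/updates/2.2.4.2|' "$REPO"

grep '^baseurl=' "$REPO"
# -> baseurl=http://mirror.example.com/HDP/centos6/2.x/updates/2.2.4.2
```

Then install the stuck package manually (e.g. `yum clean all && yum -y install 'hadoop_2_2_*-yarn'`, the package from the log above) and retry the operation; Ambari detects an already-installed package and moves on to the next one in the queue.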

Answer 2 (score: 0)

I solved the problem by adding

[security]
force_https_protocol=PROTOCOL_TLSv1_2

to the /etc/ambari-agent/conf/ambari-agent.ini file, and it works fine.
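A sketch of applying that setting (demonstrated on a temporary copy; the real file is /etc/ambari-agent/conf/ambari-agent.ini, and the `[agent]` section shown here is only an assumed stand-in for existing content):

```shell
INI=$(mktemp)
printf '[agent]\nloglevel=INFO\n' > "$INI"   # stand-in for the existing file

# Append the TLS protocol override under a [security] section.
printf '\n[security]\nforce_https_protocol=PROTOCOL_TLSv1_2\n' >> "$INI"

grep 'force_https_protocol' "$INI"
# -> force_https_protocol=PROTOCOL_TLSv1_2
```

After editing the real file, restart the agent on each host (`ambari-agent restart`) for the change to take effect.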