I created a Dataproc cluster with 6 nodes, and when I tried to install bdutil I ran into the following problem:
******************* gcloud compute stderr *******************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
************ ERROR logs from gcloud compute stderr ************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
******************* Exit codes and VM logs *******************
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-0-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-1-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-2-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-3-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-m-pd --zone=zone(unset)
Answer (score: 1):
HDP and Dataproc are different products. My point is that you do not need to create a Dataproc cluster in order to run bdutil. Running it from a single instance is enough, since all the required configuration is set in bdutil_env.sh/ambari.conf. The bdutil tool does not create any Dataproc cluster; instead it creates its own custom VM instances to host HDP.
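For context, bdutil is driven entirely from the command line of whatever machine you run it on. A typical invocation against the HDP/Ambari extension looks roughly like the sketch below; the extension file name and path follow the bdutil repository layout and may differ in your checkout, so treat them as assumptions:

    # Run from the root of the bdutil checkout; the env file wires in the Ambari/HDP settings
    ./bdutil -e platforms/hdp/ambari_env.sh deploy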
Some steps are not well documented:
1. I set the GOOGLE_APPLICATION_CREDENTIALS variable and the permission problem went away. This is most likely the issue you are facing (a minimal sketch follows after this list).
1.1 If that does not work, run the following command: gcloud auth activate-service-account --key-file=/PATH/JSON_CREDENTIALS
2. If you get other errors, such as "Invalid value zone(unset)", just set those values in bdutil_env.sh (see the example after this list).
2.1 If the same error persists, go directly to platform/hdp/ambari.conf and update your configuration there.
3. You will need to set up firewall rules that allow access to your instances, both so the nodes can communicate with each other and so you can reach Ambari on the master (an example rule is sketched after this list).
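For step 1, here is a minimal sketch of exporting the service-account credentials before running bdutil; the key file path is a placeholder, substitute your own JSON key:

    # Placeholder path to your downloaded service-account JSON key
    export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
    # Step 1.1 fallback: authenticate gcloud itself with the same key
    gcloud auth activate-service-account --key-file=/path/to/service-account-key.json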
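For step 2, the "--zone=zone(unset)" in the log above suggests the zone (and possibly the project and staging bucket) were never filled in. A sketch of the relevant lines in bdutil_env.sh, with placeholder values; verify the exact variable names against your copy of the file:

    # In bdutil_env.sh (values are examples only)
    PROJECT='hdpgcp-217320'            # project id taken from the log above
    CONFIGBUCKET='your-staging-bucket' # GCS bucket bdutil uses for staging
    GCE_ZONE='us-central1-a'           # leaving this unset is what produced "zone(unset)"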
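For step 3, a hedged example of a firewall rule that lets your workstation reach the Ambari web UI (Ambari defaults to port 8080); the rule name, network, and source range are assumptions to adapt:

    # Allow your own IP to reach Ambari on the master node
    gcloud compute firewall-rules create allow-ambari-ui \
        --network=default \
        --allow=tcp:8080 \
        --source-ranges=YOUR_IP/32

On the default network, the pre-created default-allow-internal rule usually covers node-to-node traffic; on a custom network you would add a similar rule for your internal ranges.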
After completing the steps above, I was able to install HDP using Ambari.