Submitting a local Spark job to EMR

Date: 2019-02-08 19:22:31

Tags: amazon-web-services apache-spark hadoop amazon-emr

I am following the Amazon documentation for submitting a Spark job to a remote EMR cluster: https://aws.amazon.com/premiumsupport/knowledge-center/emr-submit-spark-job-remote-cluster/
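For context, the setup in that guide boils down to roughly the following steps (a sketch only; the key file, master hostname, and local paths below are placeholder assumptions):

```shell
# On the local machine: copy the cluster's Hadoop/YARN config
# from the EMR master node (hypothetical key, host, and paths)
mkdir -p ~/emr-conf
scp -i mykey.pem hadoop@ec2-XX-XX-XX-XX.us-east-2.compute.amazonaws.com:/etc/hadoop/conf/* ~/emr-conf/

# Point the local Spark install at that configuration
export HADOOP_CONF_DIR=~/emr-conf
export YARN_CONF_DIR=~/emr-conf

# Submit against the cluster's YARN resource manager
spark-submit --master yarn --deploy-mode cluster my_job.py
```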

After following the instructions, I have not been able to troubleshoot my way past messages like the following, caused by an address that cannot be resolved:

  

ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: ip-172-32-1-231.us-east-2.compute.internal
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)

When I saw that the IP it was trying to resolve belonged to the master node, I used sed to change it to the master's public address in the configuration files (the ones obtained from the /etc/hadoop/conf directory on the master node). But then the error became a connection timeout to a data node:

  

INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.32.1.41:50010]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:533)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
19/02/08 13:54:58 INFO hdfs.DFSClient: Abandoning BP-1960505320-172.32.1.231-1549632479324:blk_1073741907_1086
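The sed substitution described above might look like the following, demonstrated here on a sample config line (the public DNS name is a placeholder; the real value comes from the EMR console):

```shell
# Hostnames: the private DNS is from the error above; the public
# DNS is a hypothetical placeholder for the master's public address.
PRIVATE_DNS="ip-172-32-1-231.us-east-2.compute.internal"
PUBLIC_DNS="ec2-18-220-0-0.us-east-2.compute.amazonaws.com"

# Create a sample config file containing the private DNS name
conf=$(mktemp)
echo "hdfs://${PRIVATE_DNS}:8020" > "$conf"

# Same substitution as applied to the copied conf directory, e.g.:
#   sed -i "s/${PRIVATE_DNS}/${PUBLIC_DNS}/g" ~/emr-conf/*
sed -i "s/${PRIVATE_DNS}/${PUBLIC_DNS}/g" "$conf"
cat "$conf"
```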

Finally, I tried the same solution as in this question: Spark HDFS Exception in createBlockOutputStream while uploading resource file

Adding the following to the hdfs-site.xml file:

<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>   

But the error persisted as an unresolved address exception:

19/02/08 13:58:06 WARN hdfs.DFSClient: DataStreamer Exception
java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:101)
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)

Can someone help me set up Spark on my local machine so that I can spark-submit to a remote EMR cluster?

1 answer:

Answer 0 (score: 1):

In addition to the answer given in the linked question, you should also add the worker nodes' (public) IPs and (private) DNS names to your /etc/hosts file.
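As a sketch, the /etc/hosts entries might look like this (the public IPs below are placeholders; map the actual public IPs of your core/task nodes, from the EMR console, to their private DNS names):

```
# /etc/hosts on the local machine (hypothetical public IPs)
18.220.10.11  ip-172-32-1-41.us-east-2.compute.internal
18.220.10.12  ip-172-32-1-42.us-east-2.compute.internal
```

With these entries, the private DNS names that the namenode returns for each datanode resolve locally to addresses reachable from outside the VPC (provided the cluster's security groups allow the traffic). This complements the `dfs.client.use.datanode.hostname` setting, which makes the HDFS client connect by hostname rather than by private IP.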