Spark cannot submit a job to a remote YARN cluster: java.lang.NumberFormatException

Date: 2018-01-24 02:49:39

Tags: hadoop apache-spark pyspark

I am new to Spark and Hadoop. I have set up a single-node Hadoop cluster on a Windows machine and am trying to submit jobs to it from my Mac laptop.

The command I use is

    ./pyspark --master yarn --deploy-mode client
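For reference, the same submission can also be expressed as a short PySpark script rather than launching the shell. This is only a sketch: it assumes Spark 2.x and that HADOOP_CONF_DIR (or YARN_CONF_DIR) on the laptop points at the directory containing the yarn-site.xml shown below; the application name is made up.

    from pyspark.sql import SparkSession

    # HADOOP_CONF_DIR / YARN_CONF_DIR must point at the client-side copy of
    # yarn-site.xml so that "yarn" below resolves to the remote ResourceManager.
    spark = (
        SparkSession.builder
        .master("yarn")
        .config("spark.submit.deployMode", "client")  # driver stays on the laptop
        .appName("remote-yarn-smoke-test")            # hypothetical name
        .getOrCreate()
    )

    # Trivial job just to force an executor to be launched on the cluster.
    print(spark.sparkContext.parallelize(range(100)).sum())
    spark.stop()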

My yarn-site.xml file is as follows:

    <configuration>
      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
      </property>

      <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
      </property>

      <property>
        <name>yarn.resourcemanager.address</name>
        <value>192.168.0.11:8032</value>
      </property>

      <property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>192.168.0.11:8025</value>
      </property>

      <property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>192.168.0.11:8030</value>
      </property>

      <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
      </property>
    </configuration>
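Before digging into Spark itself, one quick sanity check from the laptop is to probe the ResourceManager ports configured above. A small sketch (addresses and ports taken from the yarn-site.xml; the 3-second timeout is arbitrary):

    import socket

    # Ports from yarn-site.xml: RM client (8032), scheduler (8030),
    # resource tracker (8025).
    for port in (8032, 8030, 8025):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(3)
            status = "open" if s.connect_ex(("192.168.0.11", port)) == 0 else "unreachable"
            print(f"192.168.0.11:{port} -> {status}")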

This file is identical on the cluster and on the laptop. The error message in the YARN logs looks like this:

    java.lang.NumberFormatException: For input string: "49187'"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
        at java.lang.Integer.parseInt(Integer.java:615)
        at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
        at scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
        at org.apache.spark.util.Utils$.parseHostPort(Utils.scala:980)
        at org.apache.spark.deploy.yarn.ApplicationMaster.waitForSparkDriver(ApplicationMaster.scala:555)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:433)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:256)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:762)
        at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:785)
        at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
    18/01/23 21:24:51 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.NumberFormatException: For input string: "49187'")
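Per the stack trace, the ApplicationMaster calls Utils.parseHostPort on the driver address it was handed and then tries to convert the port part to an integer; the stray trailing single quote in "49187'" is what makes that conversion fail. A rough Python mimic of that parse (illustrative only, not Spark's actual code):

    # Mimics the host:port split done in org.apache.spark.util.Utils.parseHostPort.
    def parse_host_port(addr):
        host, _, port = addr.rpartition(":")
        return host, int(port)  # int("49187'") raises, just like Java's Integer.parseInt

    parse_host_port("192.168.0.12:49187")   # fine: ('192.168.0.12', 49187)
    parse_host_port("192.168.0.12:49187'")  # raises ValueError: the quote came through with the argument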

I went through a pile of the YARN ResourceManager's stderr messages and found

    INFO amlauncher.AMLauncher: Command to launch container container_1516760256160_0002_02_000001 : {{JAVA_HOME}}/bin/java,-server,-Xmx512m,-Djava.io.tmpdir={{PWD}}/tmp,-Dspark.yarn.app.container.log.dir=<LOG_DIR>,org.apache.spark.deploy.yarn.ExecutorLauncher,--arg,'192.168.0.12:49187',--properties-file,{{PWD}}/__spark_conf__/__spark_conf__.properties,1>,<LOG_DIR>/stdout,2>,<LOG_DIR>/stderr

which shows the number 49187. However, I don't know where the command above gets invoked. Can someone point me to how I can fix this?

0 Answers:

There are no answers