"Job Launcher ClassName is not set" error on E-MapReduce

Time: 2019-06-18 02:56:15

Tags: apache-spark alibaba-cloud alibaba-cloud-emapreduce

I am trying to run Spark and Spark SQL jobs from the Alibaba Cloud E-MapReduce workflow console.

I am submitting the job with the following arguments:

--master yarn-client --driver-memory 7g --num-executors 10 --executor-memory 5g --executor-cores 1 --jars ossref://emr/checklist/jars/emr-core-0.1.0.jar ossref://emr/checklist/python/wordcount.py oss://emr/checklist/data/kddb 5 32
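For context, the script being submitted is a word-count job. Its contents aren't shown in the question; as a rough sketch of what such a script computes, here is the same transformation in plain Python (the real `wordcount.py` presumably uses PySpark's RDD API on the OSS input path, so everything below is an assumption for illustration):

```python
from collections import Counter

def word_count(lines):
    """Tally whitespace-separated word frequencies across the input lines."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return dict(counts)

# Tiny illustration on in-memory data instead of an OSS dataset:
result = word_count(["to be or", "not to be"])
print(result)
```

In the PySpark version this would be the familiar `flatMap` / `reduceByKey` pipeline over `sc.textFile(...)`.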

Running it fails with the following error:


Tue Jun 18 10:48:58 CST 2019 [LocalJobLauncherAM] INFO Starting flow launcher ...
================ PRINT RUNTIME ENV BEGIN ================

== System Properties ============

java.runtime.name = OpenJDK Runtime Environment
sun.boot.library.path = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/amd64
java.vm.version = 25.151-b12
java.vm.vendor = Oracle Corporation
java.vendor.url = http://java.oracle.com/
path.separator = :
java.vm.name = OpenJDK 64-Bit Server VM
file.encoding.pkg = sun.io
user.country = US
sun.java.launcher = SUN_STANDARD
sun.os.patch.level = unknown
java.vm.specification.name = Java Virtual Machine Specification
user.dir = /mnt/disk1/flow-agent/local-rm/LocalApplication_1560824463304_2/container_1560824463304_2_01_000001
java.runtime.version = 1.8.0_151-b12
java.awt.graphicsenv = sun.awt.X11GraphicsEnvironment
java.endorsed.dirs = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/endorsed
os.arch = amd64
java.io.tmpdir = tmp
line.separator =

java.vm.specification.vendor = Oracle Corporation
os.name = Linux
sun.jnu.encoding = UTF-8
emr.flow.user = hadoop
java.library.path = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/amd64/server:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/amd64:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/../lib/amd64:/lib64:/usr/lib/hadoop-current/lib/native/::/usr/lib/jvm/java-1.8.0/jre/lib/amd64/server:/usr/lib/hadoop-current/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
java.specification.name = Java Platform API Specification
java.class.version = 52.0
sun.management.compiler = HotSpot 64-Bit Tiered Compilers
os.version = 3.10.0-693.2.2.el7.x86_64
user.home = /home/hadoop
user.timezone = Asia/Shanghai
java.awt.printerjob = sun.print.PSPrinterJob
file.encoding = UTF-8
java.specification.version = 1.8
flow.job.id = FJI-F6D6115A3E436AAC_0
java.class.path = launcher.jar
user.name = hadoop
flow.job.result.local.dir = /mnt/disk1/flow-agent/job-results
flow.job.launcher.class.name = com.aliyun.emr.flow.agent.jobs.launcher.impl.SparkSqlJobLauncherImpl
java.vm.specification.version = 1.8
sun.java.command = com.aliyun.emr.flow.agent.jobs.launcher.local.LocalJobLauncherAM
java.home = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre
sun.arch.data.model = 64
user.language = zh-CN
java.specification.vendor = Oracle Corporation
awt.toolkit = sun.awt.X11.XToolkit
java.vm.info = mixed mode
java.version = 1.8.0_151
java.ext.dirs = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/ext:/usr/java/packages/lib/ext
sun.boot.class.path = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/resources.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/rt.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/jsse.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/jce.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/charsets.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/jfr.jar:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/classes
java.vendor = Oracle Corporation
file.separator = /
java.vendor.url.bug = http://bugreport.sun.com/bugreport/
flow.job.meta.data.path = job.metadata
sun.io.unicode.encoding = UnicodeLittle
sun.cpu.endian = little
sun.cpu.isalist =

== System Envs ==================

PATH = /usr/lib/spark-current/bin:/usr/lib/analytics-zoo/bin:/usr/lib/anaconda/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/lib/flow-agent-current/bin:/usr/lib/hadoop-current/bin:/usr/lib/hadoop-current/sbin:/usr/lib/hadoop-current/bin:/usr/lib/hadoop-current/sbin:/usr/lib/hadoop-current/bin:/usr/lib/hadoop-current/sbin:/home/hadoop/.local/bin:/home/hadoop/bin
HADOOP_CONF_DIR = /etc/ecm/hadoop-conf
HISTCONTROL = ignoredups
FLOW_AGENT_HOME = /usr/lib/flow-agent-current
JAVA_LIBRARY_PATH = /usr/lib/hadoop-current/lib/native:/usr/lib/hadoop-current/lib/native:
HISTSIZE = 1000
JAVA_HOME = /usr/lib/jvm/java-1.8.0
ZOOCFGDIR = /etc/ecm/zookeeper-conf
TERM = unknown
XFILESEARCHPATH = /usr/dt/app-defaults/%L/Dt
HADOOP_PID_DIR = /usr/lib/hadoop-current/pids
LANG = en_US.UTF-8
XDG_SESSION_ID = c76
HADOOP_CLASSPATH = /opt/apps/extra-jars/:/usr/lib/spark-current/yarn/spark-2.3.2-yarn-shuffle.jar:/opt/apps/extra-jars/
MAIL = /var/spool/mail/hadoop
SPARK_HOME = /usr/lib/spark-current
ANALYTICS_ZOO_HOME = /usr/lib/analytics-zoo
LD_LIBRARY_PATH = /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/amd64/server:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/lib/amd64:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el7_4.x86_64/jre/../lib/amd64:/lib64:/usr/lib/hadoop-current/lib/native/::/usr/lib/jvm/java-1.8.0/jre/lib/amd64/server:/usr/lib/hadoop-current/lib/native
YARN_LOG_DIR = /var/log/hadoop-yarn
JVMFLAGS = -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=128M -Xloggc:/mnt/disk1/log/zookeeper/zookeeper-gc.log -javaagent:/var/lib/ecm-agent/data/jmxetric-1.0.8.jar=host=localhost,port=8649,mode=unicast,wireformat31x=true,process=ZOOKEEPER_ZOOKEEPER,config=/var/lib/ecm-agent/data/jmxetric.xml
HADOOP_HDFS_HOME = /usr/lib/hadoop-current
LOGNAME = hadoop
PWD = /mnt/disk1/flow-agent/local-rm/LocalApplication_1560824463304_2/container_1560824463304_2_01_000001
_ = /usr/lib/jvm/java-1.8.0/bin/java
LESSOPEN = ||/usr/bin/lesspipe.sh %s
SHELL = /bin/bash
ANACONDA_HOME = /usr/lib/anaconda
OLDPWD = /mnt/disk1/flow-agent/local-rm/LocalApplication_1560824463304_2/container_1560824463304_2_01_000001
USER = hadoop
YARN_PID_DIR = /usr/lib/hadoop-current/pids
HADOOP_MAPRED_PID_DIR = /usr/lib/hadoop-current/pids
SPARK_CONF_DIR = /etc/ecm/spark-conf
HOSTNAME = emr-header-1.cluster-121550
SPARK_PID_DIR = /usr/lib/spark-current/pids
ZOO_LOG_DIR = /mnt/disk1/log/zookeeper
NLSPATH = /usr/dt/lib/nls/msg/%L/%N.cat
HADOOP_MAPRED_LOG_DIR = /var/log/hadoop-mapred
HADOOP_HOME = /usr/lib/hadoop-current
HADOOP_LOG_DIR = /var/log/hadoop-hdfs
FLOW_AGENT_CONF_DIR = /etc/ecm/flow-agent-conf
HOME = /home/hadoop
SHLVL = 3
ZOOKEEPER_HOME = /usr/lib/zookeeper-current
SPARK_LOG_DIR = /mnt/disk1/log/spark

== System Params =============

[]
================ PRINT RUNTIME ENV END ===================
Tue Jun 18 10:48:58 CST 2019 [LocalJobLauncherAM] ERROR The ClassName of Job Launcher is not set, exit.

1 Answer:

Answer 0 (score: 0)

Try:

--class <class_name> --master yarn-client --driver-memory 7g --num-executors 10 --executor-memory 5g --executor-cores 1 --jars ossref://emr/checklist/jars/emr-core-0.1.0.jar ossref://emr/checklist/python/wordcount.py oss://emr/checklist/data/kddb 5 32

Looking at the error message you received, you are most likely just missing the class name parameter:


ERROR The ClassName of Job Launcher is not set, exit.

I found a sample request for the E-MapReduce CreateJob API that looks like this:

https://emr.aliyuncs.com/?Action=CreateJob &Name=CreateJobApiTest-SPARK &Type=SPARK &EnvParam=--class%20org.apache.spark.examples.SparkPi%20--master%20yarn-client%20--num-executors%202%20--executor-memory%202g%20--executor-cores%202%20/opt/apps/spark-1.6.0-bin-hadoop2.6/lib/spark-examples*.jar%2010 &FailAct=STOP &RegionId=cn-hangzhou &<Common request parameters>
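The `EnvParam` value in that request is just the percent-encoded spark-submit argument string. As a minimal sketch (assuming Python; the argument string is taken from the sample request above), it can be produced like this:

```python
from urllib.parse import quote

# spark-submit arguments from the sample CreateJob request above.
env_param = ("--class org.apache.spark.examples.SparkPi --master yarn-client "
             "--num-executors 2 --executor-memory 2g --executor-cores 2 "
             "/opt/apps/spark-1.6.0-bin-hadoop2.6/lib/spark-examples*.jar 10")

# Keep '/' and '*' literal, as the sample URL does; spaces become %20.
encoded = quote(env_param, safe="/*")
print(encoded)
```

Building the value this way avoids the kind of encoding mistakes (e.g. a `--` turning into an en dash) that hand-edited sample URLs often contain.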

Hope that helps.