Error submitting a Spark job on a Hortonworks cluster

Date: 2018-10-22 15:38:56

Tags: java scala apache-spark hadoop

I get an error when running a Spark job; it looks like an incompatibility between jar files. Please assist.

Command

ata1/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6/bin/spark-submit --queue dwh --class class_name  --master yarn-cluster  --num-executors 20 --driver-memory 1g --driver-java-options "-XX:MaxPermSize=1G" --executor-memory 4g --executor-cores 1 jarpath/jarname

The error is below

Warning: Ignoring non-spark config property: hive.metastore.uris=thrift://ip-xxx-xx-xx-xxx.ap-xxxxx-1.compute.internal:xxxx
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data1/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6/lib/spark-assembly-1.6.3-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/10/22 15:30:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/10/22 15:30:23 INFO impl.TimelineClientImpl: Timeline service address: http://ip-xxx-xx-xx-xxx.ap-xxxxx-1.compute.internal:xxxx/ws/v1/timeline/
Exception in thread "main" java.lang.IllegalAccessError: tried to access method org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider.getProxyInternal()Ljava/lang/Object; from class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider
    at org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider.init(RequestHedgingRMFailoverProxyProvider.java:75)
    at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:163)
    at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:93)
    at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:174)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:126)
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1021)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
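This `IllegalAccessError` is a known symptom of mixing the standalone Spark 1.6.3 assembly (built against stock Hadoop 2.6) with the HDP 2.6.4 client jars on the classpath: HDP's `RequestHedgingRMFailoverProxyProvider` tries to call a method in `ConfiguredRMFailoverProxyProvider` that is not accessible in the older Hadoop classes bundled in the Spark assembly. A commonly suggested workaround (a sketch, untested against this particular cluster) is to force YARN to use the non-hedging failover provider via Spark's `spark.hadoop.*` passthrough, either in `spark-defaults.conf` or on the command line:

```
# spark-defaults.conf (or pass as --conf key=value to spark-submit)
# Forces the plain ConfiguredRMFailoverProxyProvider instead of HDP's
# RequestHedgingRMFailoverProxyProvider, avoiding the IllegalAccessError.
spark.hadoop.yarn.client.failover-proxy-provider org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider
```

Alternatively, using the HDP-provided Spark build (which is compiled against the HDP Hadoop jars) avoids the mismatch entirely.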

0 Answers