Running an Apache Spark program on YARN from IntelliJ IDEA

Date: 2014-12-02 21:23:17

Tags: intellij-idea apache-spark

I have set up Apache Spark 1.1.1 to run on YARN (Hadoop 2.5.2). I can run programs with the spark-submit command.

I am using IntelliJ IDEA 14. I am able to build the artifact and run the resulting jar with spark-submit.

However, I would like to know whether it is possible to run the whole program directly from IntelliJ.

I added the necessary libraries and activated the hadoop-2.4 profile, but I end up with the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.UserGroupInformation.getCredentials()Lorg/apache/hadoop/security/Credentials;
at org.apache.spark.deploy.yarn.ClientBase$class.$init$(ClientBase.scala:58)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:37)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:43)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:91)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:333)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
at WordCountWorkFlow.main(WordCountWorkFlow.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

Can someone tell me where I am going wrong?
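A NoSuchMethodError like this usually means an older hadoop-common jar (pulled in transitively, e.g. from the Spark build's default Hadoop version) is shadowing the Hadoop 2.5.2 jars on the IDE classpath. A minimal diagnostic sketch (the class name `WhichJar` is mine, not from the question) that prints which jar a class was actually loaded from; point it at `org.apache.hadoop.security.UserGroupInformation` under your IntelliJ run configuration:

```java
public class WhichJar {
    // Returns the jar or directory a class was loaded from,
    // or a marker string for classes from the JDK itself.
    static String locationOf(Class<?> c) {
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // Replace WhichJar.class with
        // org.apache.hadoop.security.UserGroupInformation.class
        // to see which hadoop-common jar IntelliJ is really using.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If the printed path points at a hadoop-common jar older than 2.5.2, excluding that transitive dependency (or putting your Hadoop jars first) should resolve the mismatch.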

1 Answer:

Answer 0 (score: 1)

In IntelliJ you have to add a dependency on the Hadoop conf dir, i.e. the path to it.

Go to Project Settings and, under Dependencies, add the path $HADOOP_HOME/etc/hadoop.
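An alternative (not from the original answer): Spark's YARN mode can also locate the Hadoop configuration through the HADOOP_CONF_DIR / YARN_CONF_DIR environment variables, which you can set in the IntelliJ run configuration instead of adding the directory as a dependency. A minimal sketch, assuming HADOOP_HOME points at your Hadoop 2.5.2 install:

```shell
# Assumes HADOOP_HOME is already set to the Hadoop installation root.
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
export YARN_CONF_DIR="$HADOOP_CONF_DIR"
```

In IntelliJ these go into Run -> Edit Configurations -> Environment variables for the run configuration that launches the driver.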

If you are using any lambdas, then go to Project Settings -> Sources and set the language level to 8 (Lambdas, type annotations etc.).
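To illustrate the language-level point: code like the following only compiles once the module's language level is set to 8, since it uses a lambda (a toy example of mine, not from the question's job):

```java
import java.util.Arrays;
import java.util.List;

public class LambdaCheck {
    // Counts words longer than 3 characters using a Java 8 lambda;
    // fails to compile at language level 7 or lower.
    static long countLongWords(List<String> words) {
        return words.stream().filter(w -> w.length() > 3).count();
    }

    public static void main(String[] args) {
        System.out.println(countLongWords(Arrays.asList("spark", "on", "yarn"))); // prints 2
    }
}
```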