Spark-YARN: Wrong FS

Date: 2017-11-30 10:20:52

Tags: apache-spark, yarn

I have a working Spark job, but when I try to run it through YARN it throws an exception, and I can't tell what needs to be changed to fix it. The error is as follows:

2017-11-30 10:28:49,952 [main] INFO  org.apache.spark.deploy.yarn.Client  - Source and destination file systems are the same. Not copying hdfs://servername1.domain.net:8020/user/oozie/share/lib/lib_20171123121217/spark2/spark-yarn_2.11-2.2.0.cloudera1.jar
2017-11-30 10:28:49,972 [main] INFO  org.apache.spark.deploy.yarn.Client  - Deleted staging directory hdfs://servername1:8020/user/pe3016/.sparkStaging/application_1511521415490_0216
2017-11-30 10:28:49,974 [main] ERROR org.apache.spark.SparkContext  - Error initializing SparkContext.
java.lang.IllegalArgumentException: Wrong FS: hdfs://servername.domain.net:8020/user/oozie/share/lib/lib_20171123121217/spark2/spark-yarn_2.11-2.2.0.cloudera1.jar, expected: hdfs://servername:8020

I am using Cloudera 5.12.1.
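
My own reading of the log (an assumption, not a confirmed diagnosis): the exception looks like Hadoop's FileSystem path check rejecting a Path whose authority does not match the filesystem it is resolved against. The "expected:" part of the message suggests fs.defaultFS uses the short hostname, while the Oozie sharelib jar is referenced via the fully-qualified host. The sketch below only illustrates that mismatch; the hostnames and jar path are copied from the log, everything else (object name, config value) is illustrative.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Minimal sketch of how the "Wrong FS" mismatch seems to arise (assumption):
// the FileSystem is bound to the short-hostname authority from "expected: ...",
// while the sharelib jar is addressed with the fully-qualified host.
object WrongFsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Assumed value, taken from the "expected:" part of the exception message.
    conf.set("fs.defaultFS", "hdfs://servername:8020")

    // FileSystem bound to the short-hostname authority.
    val fs = FileSystem.get(conf)

    // Jar path copied from the log: fully-qualified hostname, same NameNode port.
    val jar = new Path("hdfs://servername.domain.net:8020/user/oozie/share/lib/" +
      "lib_20171123121217/spark2/spark-yarn_2.11-2.2.0.cloudera1.jar")

    // On a host where the two names are not recognized as the same service, this fails with:
    // java.lang.IllegalArgumentException: Wrong FS: ..., expected: hdfs://servername:8020
    fs.getFileStatus(jar)
  }
}
```

If that reading is right, the fix would presumably be making both sides use the same authority (for example, referencing the sharelib with the same URI as fs.defaultFS, or vice versa), but I haven't confirmed which setting Cloudera 5.12.1 expects here, which is why I'm asking.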

0 Answers:

There are no answers yet.