Unable to run a Spark Java project from Eclipse

Asked: 2019-04-25 02:04:44

Tags: java eclipse scala apache-spark

I have a Maven-based Spark Java project that I want to run on my local PC, but I am running into some Scala-related errors.

See the console output below. I'm not sure what exactly is going wrong here.

log4j:WARN No appenders could be found for logger (com.spark.craft.demo.SparkApplication).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.VerifyError: Uninitialized object exists on backward branch 209
Exception Details:
  Location:
    scala/collection/immutable/HashMap$HashTrieMap.split()Lscala/collection/immutable/Seq; @249: goto
  Reason:
    Error exists in the bytecode
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at com.spark.craft.demo.SparkApplication.start(SparkApplication.java:34)
    at com.spark.craft.demo.Application.main(Application.java:51)
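For reference, a Maven dependency section along the following lines would match the stack trace above. This is a hypothetical sketch, not taken from the actual project; the version numbers are assumptions for illustration only:

```xml
<!-- Hypothetical pom.xml fragment; versions are assumptions, not from the real project. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- The _2.11 suffix is the Scala version Spark was compiled against. -->
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
  </dependency>
</dependencies>
```

Note that the Scala-version suffix in the artifact id (`_2.11` here) must match the `scala-library` jar actually on the classpath; a mismatched or stale Scala jar pulled in transitively is one common cause of `java.lang.VerifyError` at class-loading time.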

0 Answers:

No answers yet