IllegalStateException when running the Spark wordcount example

Date: 2016-11-25 09:37:48

Tags: hadoop apache-spark

Spark version 1.6.2, Hadoop version 2.7.3.

While running Spark in standalone cluster mode, I submit the wordcount example with this command:

spark-submit --class org.apache.spark.examples.JavaWordCount --master spark://IP:7077 spark-examples-1.6.2.2.5.0.0-1245-hadoop2.7.3.2.5.0.0-1245.jar file.txt output

I get the following error:

INFO cluster.SparkDeploySchedulerBackend: Executor app-20161125052710-0012/10 removed: java.io.IOException: Failed to create directory /usr/hdp/2.5.0.0-1245/spark/work/app-20161125052710-0012/10    
ERROR spark.SparkContext: Error initializing SparkContext.
    java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
    This stopped SparkContext was created at:

    org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:44)
    sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    java.lang.reflect.Method.invoke(Method.java:606)
    org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    The currently active SparkContext was created at:

    (No active SparkContext.)

        at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:106)
        at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1602)
        at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2203)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
        at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:44)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    16/11/25 04:24:48 INFO spark.SparkContext: SparkContext already stopped.

In the Spark master web UI, I can see two workers in the ALIVE state.
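The first INFO line points at a worker-side problem rather than the driver: the executor was removed because the worker could not create its application directory. Before touching anything else, it is worth confirming whether that work directory is writable on every worker. A minimal diagnostic sketch, assuming hypothetical worker hostnames `worker1` and `worker2` (replace with your own):

```shell
# Check that the Spark "work" directory from the error message
# exists and is writable on each worker node.
WORK_DIR=/usr/hdp/2.5.0.0-1245/spark/work
for host in worker1 worker2; do   # replace with your actual worker hostnames
  ssh "$host" "test -w '$WORK_DIR' \
    && echo '$host: writable' \
    || echo '$host: NOT writable'"
done
```

Any worker reporting `NOT writable` will fail to launch executors exactly as shown in the log above.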

1 answer:

Answer 0 (score: 0)

It seems Failed to create directory /usr/hdp/2.5.0.0-1245/spark/work is the root cause. After granting the right permissions on the /usr/hdp/2.5.0.0-1245/spark/work path, it ran fine.
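A sketch of the permission fix, run on each worker node. It assumes the worker daemons run as a `spark` user; adjust the owner to whatever account actually starts your Spark workers:

```shell
# Hypothetical fix: make the Spark work directory owned and writable
# by the user that runs the worker daemons (assumed here to be "spark").
SPARK_WORK=/usr/hdp/2.5.0.0-1245/spark/work
sudo mkdir -p "$SPARK_WORK"
sudo chown -R spark:spark "$SPARK_WORK"
sudo chmod -R 755 "$SPARK_WORK"
```

The work directory location itself can also be moved to a path the worker user already owns by setting `SPARK_WORKER_DIR` in `spark-env.sh`.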