Spark job fails

Date: 2017-08-23 09:41:13

Tags: hadoop apache-spark yarn

When I try to launch a Spark job from R, I get this error:

Erreur : java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82) ....

In the Spark logs (/opt/mapr/spark/spark-version/logs) I found many of these exceptions:

ERROR FsHistoryProvider: Exception encountered when attempting to load application log maprfs:///apps/spark/.60135a9b-ec7c-4f71-8f92-4d4d2fbb1e2b
java.io.FileNotFoundException: File maprfs:///apps/spark/.60135a9b-ec7c-4f71-8f92-4d4d2fbb1e2b does not exist.

Does anyone know how to solve this problem?

1 answer:

Answer 0 (score: 1)

You need to create the SparkContext (or get the existing one if it already exists):

import org.apache.spark.{SparkConf, SparkContext}

// 1. Create the Spark configuration
val conf = new SparkConf()
  .setAppName("SparkMe Application")
  .setMaster("local[*]")  // local mode

// 2. Get the active SparkContext if one exists, or create a new one from the conf
val sc = SparkContext.getOrCreate(conf)
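As a minimal sketch of why `getOrCreate` avoids the "stopped SparkContext" error (assuming Spark is on the classpath; the app name and local master below are illustrative, not the asker's cluster config): once the active context has been stopped, a later `getOrCreate` builds a fresh context instead of handing back the dead instance.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical local-mode configuration for illustration only
val conf = new SparkConf()
  .setAppName("GetOrCreateDemo")
  .setMaster("local[*]")

val sc1 = SparkContext.getOrCreate(conf) // no active context yet, so one is created
sc1.stop()                               // the context is now stopped

// With no active context remaining, getOrCreate creates a new one rather
// than returning the stopped instance whose methods would throw
// "Cannot call methods on a stopped SparkContext".
val sc2 = SparkContext.getOrCreate(conf)
```

This is why calling `getOrCreate` is safer than constructing `new SparkContext(conf)` directly, which fails if another context is still registered in the same JVM.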