Running Spark interactively in IntelliJ's Scala worksheet: 'akka.version'

Date: 2015-01-06 15:40:31

Tags: scala intellij-idea apache-spark

I am trying to run Spark in an IntelliJ Scala worksheet, but I get the error No configuration setting found for key 'akka.version'.

Worksheet contents:

import org.apache.spark.SparkContext
// Local mode with 8 worker threads, application name "sc1"
val sc1 = new SparkContext("local[8]", "sc1")
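
For context, the two-argument constructor above is shorthand for an explicit SparkConf; a sketch of the equivalent form (Spark 1.x API):

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent construction: master URL and app name set on a SparkConf.
val conf = new SparkConf()
  .setMaster("local[8]") // run locally with 8 worker threads
  .setAppName("sc1")
val sc1 = new SparkContext(conf)

The worksheet fails the same way in either form, since the stack trace below shows the error occurring while Spark creates its Akka ActorSystem.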

Full stack trace:

import org.apache.spark.SparkContext
15/01/06 16:30:32 INFO spark.SecurityManager: Changing view acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: Changing modify acls to: tobber
15/01/06 16:30:32 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tobber); users with modify permissions: Set(tobber)
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(spark.sc0.tmp:111)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:132)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:138)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:146)
    at com.typesafe.config.impl.SimpleConfig.find(spark.sc0.tmp:151)
    at com.typesafe.config.impl.SimpleConfig.getString(spark.sc0.tmp:193)
    at akka.actor.ActorSystem$Settings.<init>(spark.sc0.tmp:132)
    at akka.actor.ActorSystemImpl.<init>(spark.sc0.tmp:466)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:107)
    at akka.actor.ActorSystem$.apply(spark.sc0.tmp:100)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(spark.sc0.tmp:117)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(spark.sc0.tmp:50)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(spark.sc0.tmp:49)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(spark.sc0.tmp:1500)
    at scala.collection.immutable.Range.foreach$mVc$sp(spark.sc0.tmp:137)
    at org.apache.spark.util.Utils$.startServiceOnPort(spark.sc0.tmp:1491)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(spark.sc0.tmp:52)
    at org.apache.spark.SparkEnv$.create(spark.sc0.tmp:149)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:200)
    at org.apache.spark.SparkContext.<init>(spark.sc0.tmp:115)
    at apps.A$A1$A$A1.sc$lzycompute(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.sc(spark.sc0.tmp:2)
    at apps.A$A1$A$A1.get$$instance$$sc(spark.sc0.tmp:2)
    at #worksheet#.#worksheet#(spark.sc0.tmp:9)

1 Answer:

Answer 0 (score: 0)

The solution is to use the Scala Console instead.

In your Spark project, just press Ctrl + Shift + D (Cmd + Shift + D on macOS) in a Scala file to open the Scala Console. Paste the code there and run it with Ctrl + Enter (Cmd + Enter on macOS).
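
For reference, a minimal smoke test you might paste into the console; this is a sketch assuming Spark 1.x is on the project classpath, with the sc1 name taken from the question:

import org.apache.spark.{SparkConf, SparkContext}

// Build the context, then verify it works with a tiny job.
val conf = new SparkConf().setMaster("local[8]").setAppName("sc1")
val sc1 = new SparkContext(conf)
println(sc1.parallelize(1 to 100).reduce(_ + _)) // prints 5050
sc1.stop() // release the local executor threads and ports

The console presumably avoids the worksheet's classloader problem: the 'akka.version' key lives in Akka's reference.conf, which must be visible on the runtime classpath for the ActorSystem to start.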