Using Spark 1.6 and Akka in the same application

Asked: 2019-02-20 16:53:10

Tags: scala apache-spark akka

I'm building an application that involves some ETL jobs, which I'm trying to implement with Spark 1.6. There is also a web server that I've implemented with akka-http. Both pieces work fine on their own, but when I move them into a single application, Spark starts failing with:

Exception in thread "main" java.lang.NoSuchMethodError: akka.actor.LocalActorRefProvider.log()Lakka/event/LoggingAdapter;
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:128)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(ReflectiveDynamicAccess.scala:33)
at scala.util.Try$.apply(Try.scala:192)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(ReflectiveDynamicAccess.scala:28)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(ReflectiveDynamicAccess.scala:39)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(ReflectiveDynamicAccess.scala:39)
at scala.util.Success.flatMap(Try.scala:231)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(ReflectiveDynamicAccess.scala:39)
at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:795)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:788)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:246)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:289)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:264)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2024)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2015)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)

Here are my dependencies:

"org.apache.spark" %% "spark-sql" % "1.6.2",
"org.apache.spark" %% "spark-hive" % "1.6.2",
"com.databricks" %% "spark-csv" % "1.5.0",
"com.typesafe.akka" %% "akka-actor" % "2.5.19",
"com.typesafe.akka" %% "akka-stream" % "2.5.19",
"com.typesafe.akka" %% "akka-http" % "10.1.3",
"com.typesafe.akka" %% "akka-http-spray-json" % "10.1.3"

Sample code:

import org.apache.spark.{SparkConf, SparkContext}

object AppStarter {

  def main(args: Array[String]): Unit = {
    // setAppName is required by SparkConf; without it SparkContext
    // rejects the configuration before the Akka error is even reached
    val conf = new SparkConf().setMaster("local[4]").setAppName("etl")
    val sc = new SparkContext(conf) // FAIL happens here
  }

}

I tried different version settings for the akka dependencies, but that doesn't help; I just get a different error each time. Is there any way to force Spark to ignore the akka dependencies in the same scope?
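For context: Spark 1.6 transitively depends on the older Akka 2.3.x line, so the Akka 2.5.19 classes above win on the classpath and break Spark's internal actor system (hence the `NoSuchMethodError` in `LocalActorRefProvider`). One common workaround is to shade the newer Akka so the two versions no longer share package names. This is only a sketch assuming the sbt-assembly plugin is enabled; the `shaded.akka` prefix is an arbitrary, made-up name:

```scala
// build.sbt — sketch, assumes the sbt-assembly plugin.
// Relocates the Akka 2.5 packages so they cannot clash with the
// Akka 2.3.x classes that Spark 1.6 pulls in transitively.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("akka.**" -> "shaded.akka.@1")
    .inLibrary(
      "com.typesafe.akka" %% "akka-actor"  % "2.5.19",
      "com.typesafe.akka" %% "akka-stream" % "2.5.19")
    .inProject
)
```

Shading only applies to the assembled fat jar, so the conflict would still reproduce when running from sbt or the IDE directly.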

1 answer:

Answer 0 (score: -2)

It seems you want to use Spark SQL, so prefer using SparkSession.builder:

SparkSession.builder()
  .master("local[4]")
  .appName("Spark recommendation")
  .getOrCreate()

You can check the documentation here: {{3}}

You can pass additional configuration parameters, for example: .config("spark.executor.memory", "4g")
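Putting those pieces together (a sketch; note that `SparkSession` only exists from Spark 2.0 onward, so this implies upgrading off Spark 1.6 — Spark 2.0 also dropped its internal Akka dependency, which removes the version clash described in the question entirely):

```scala
// Sketch assuming an upgrade to Spark 2.x, where SparkSession
// is available and Spark no longer bundles its own Akka.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[4]")
  .appName("Spark recommendation")
  .config("spark.executor.memory", "4g") // extra config goes here
  .getOrCreate()
```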