Dead letters encountered when communicating between a Spark cluster and Akka actors

Asked: 2015-03-05 19:12:29

Tags: scala apache-spark akka

Since Spark is built on top of Akka, I would like to use Akka to send and receive messages between Spark clusters.

Following this tutorial, https://github.com/jaceklaskowski/spark-activator/blob/master/src/main/scala/StreamingApp.scala, I can run StreamingApp locally and send messages to its actorStream.
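
For reference, here is a minimal sketch of what the receiving side looks like, loosely based on that tutorial and the Spark 1.x streaming API (the batch interval and app name are my own choices; actorStream and ActorHelper come from Spark 1.x):

import akka.actor.{Actor, Props}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.ActorHelper

// ActorHelper lets a plain Akka actor feed data into a Spark stream via store()
class Helloer extends Actor with ActorHelper {
    def receive = {
        case s: String => store(s) // push each received message into the DStream
    }
}

object StreamingApp {
    def main(args: Array[String]) {
        val conf = new SparkConf(false)
            .setMaster("local[*]")
            .setAppName("Spark Streaming Receiver")
        val ssc = new StreamingContext(conf, Seconds(5))
        // registers the actor under /user/Supervisor0/helloer on the driver's actor system
        val lines = ssc.actorStream[String](Props[Helloer], "helloer")
        lines.print()
        ssc.start()
        ssc.awaitTermination()
    }
}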

Then I tried to attach the sender part to my other Spark master and send messages from that Spark master to the remote actor in StreamingApp. The code is as follows:

import org.apache.spark.{SparkConf, SparkContext, SparkEnv}

object SenderApp extends Serializable {

    def main(args: Array[String]) {

        val driverPort = 12345
        val driverHost = "xxxx"
        val conf = new SparkConf(false) 
            .setMaster("spark://localhost:8888") // Connecting to my spark master
            .setAppName("Spark Akka Streaming Sender")
            .set("spark.logConf", "true")
            .set("spark.akka.logLifecycleEvents", "true")
        val actorName = "helloer"

        // creating the context initializes SparkEnv, whose actor system is reused below
        val sc = new SparkContext(conf)

        val actorSystem = SparkEnv.get.actorSystem

        // path of the "helloer" actor registered by actorStream on the remote driver
        val url = s"akka.tcp://sparkDriver@$driverHost:$driverPort/user/Supervisor0/$actorName"

        val helloer = actorSystem.actorSelection(url)

        helloer ! "Hello"
        helloer ! "from"
        helloer ! "Spark Streaming"
        helloer ! "with"
        helloer ! "Scala"
        helloer ! "and"
        helloer ! "Akka"
    }
}

Then I got messages from StreamingApp saying it encountered dead letters, i.e. messages that could not be delivered to any actor. The details are as follows:

INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkDriver/deadLetters] to Actor[akka://sparkDriver/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkDriver%40111.22.33.444%3A56840-4#-2094758237] was not delivered. [5] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.

1 Answer:

Answer 0 (score: 1)

According to this article: http://typesafe.com/activator/template/spark-streaming-scala-akka

I changed the helloer and it works now:

import scala.concurrent.Await
import scala.concurrent.duration._

val timeout = 100.seconds

// resolveOne returns a Future[ActorRef]; block until the remote actor is resolved
val helloer = Await.result(actorSystem.actorSelection(url).resolveOne(timeout),
                           timeout)
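
This fix makes sense: actorSelection on its own just sends to a path, so if nothing is listening there the messages silently become dead letters, whereas resolveOne looks up an actual ActorRef and fails fast when the path is empty. A minimal sketch of guarding against that failure (the error message is my own; ActorNotFound is the exception Akka raises when resolution fails):

import scala.concurrent.Await
import scala.concurrent.duration._
import akka.actor.ActorNotFound

try {
    val helloer = Await.result(actorSystem.actorSelection(url).resolveOne(100.seconds),
                               100.seconds)
    helloer ! "Hello"
} catch {
    // thrown via the failed Future when no actor exists at the path
    case _: ActorNotFound =>
        println(s"No actor found at $url - is StreamingApp running?")
}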