In the context of an Akka cluster application, I ran into a problem with a property Akka requires: every (case) class and every message used must be serializable. My context is this: I want to consume data from a Redis cluster, and for that I decided to use a cluster-aware router pool, so that adding nodes gives me more workers. The workers read data from Redis and store some metadata in MongoDB. In the first version, I did it like this:
object MasterWorkers {

  def props(
      awsBucket: String,
      gapMinValueMicroSec: Long,
      persistentCache: RedisCache,
      mongoURI: String,
      mongoDBName: String,
      mongoCollectioName: String): Props =
    Props(MasterWorkers(awsBucket, gapMinValueMicroSec, persistentCache,
      mongoURI, mongoDBName, mongoCollectioName))

  case class JobRemove(deviceId: DeviceId, from: Timestamp, to: Timestamp)
}
case class MasterWorkers(
    awsBucket: String,
    gapMinValueMicroSec: Long,
    persistentCache: RedisCache,
    mongoURI: String,
    mongoDBName: String,
    mongoCollectioName: String
) extends Actor with ActorLogging {

  val workerRouter =
    context.actorOf(
      FromConfig.props(Props(classOf[Worker], awsBucket, gapMinValueMicroSec,
        self, persistentCache, mongoURI, mongoDBName, mongoCollectioName)),
      name = "workerRouter")
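For completeness, FromConfig means the router itself is declared in application.conf; a cluster-aware pool backing this workerRouter would look roughly like the following (the actor path assumes the master actor is named "masterWorkers", and the pool type and sizes are illustrative):

```hocon
akka.actor.deployment {
  # path assumes the master actor is created with name "masterWorkers"
  /masterWorkers/workerRouter {
    router = round-robin-pool
    nr-of-instances = 100
    cluster {
      enabled = on
      max-nr-of-instances-per-node = 3
      allow-local-routees = on
    }
  }
}
```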
The Worker class:
object Worker {

  def props(
      awsBucket: String,
      gapMinValueMicroSec: Long,
      replyTo: ActorRef,
      persistentCache: RedisCache,
      mongoURI: String,
      mongoDBName: String,
      mongoCollectioName: String): Props =
    Props(Worker(awsBucket, gapMinValueMicroSec, replyTo, persistentCache,
      mongoURI, mongoDBName, mongoCollectioName))

  case class JobDumpFailed(deviceId: DeviceId, from: Timestamp, to: Timestamp)
  case class JobDumpSuccess(deviceId: DeviceId, from: Timestamp, to: Timestamp)
  case class JobRemoveFailed(deviceId: DeviceId, from: Timestamp, to: Timestamp)
}
case class Worker(
    awsBucket: String,
    gapMinValueMicroSec: Long,
    replyTo: ActorRef,
    persistentCache: RedisCache,
    mongoURI: String,
    mongoDBName: String,
    mongoCollectioName: String
) extends Actor with ActorLogging {
But when I start two nodes, this throws the following exception:
[info] akka.remote.MessageSerializer$SerializationException: Failed to serialize remote message [class akka.remote.DaemonMsgCreate] using serializer [class akka.remote.serialization.DaemonMsgCreateSerializer].
[info] at akka.remote.MessageSerializer$.serialize(MessageSerializer.scala:61)
[info] at akka.remote.EndpointWriter$$anonfun$serializeMessage$1.apply(Endpoint.scala:895)
[info] at akka.remote.EndpointWriter$$anonfun$serializeMessage$1.apply(Endpoint.scala:895)
[info] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[info] at akka.remote.EndpointWriter.serializeMessage(Endpoint.scala:894)
[info] at akka.remote.EndpointWriter.writeSend(Endpoint.scala:786)
[info] at akka.remote.EndpointWriter$$anonfun$4.applyOrElse(Endpoint.scala:761)
[info] at akka.actor.Actor$class.aroundReceive(Actor.scala:497)
[info] at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:452)
[info] at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526)
[info] at akka.actor.ActorCell.invoke(ActorCell.scala:495)
[info] at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257)
[info] at akka.dispatch.Mailbox.run(Mailbox.scala:224)
[info] at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
[info] at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
[info] at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
[info] at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
[info] at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
[info] Caused by: java.io.NotSerializableException: akka.actor.ActorSystemImpl
The Redis cache is a simple case class whose companion object implements an interface, like this:
object RedisCache { // some static functions }

case class RedisCache(
    master: RedisServer,
    slaves: Seq[RedisServer]
)(implicit actorSystem: ActorSystem)
    extends PersistentCache[DeviceKey, BCPPackets] with LazyLogging {
  // some code here
}
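That `(implicit actorSystem: ActorSystem)` parameter is what surfaces as `Caused by: java.io.NotSerializableException: akka.actor.ActorSystemImpl` above: the Props shipped to the remote node carries the RedisCache instance, and serializing a case class drags every constructor argument along, implicit ones included. The effect can be shown without Akka at all (`CacheLike` and `failsToSerialize` are made-up names for illustration):

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// A stand-in for RedisCache: a case class (hence Serializable) that captures
// a non-serializable value through an implicit parameter list, just like the
// implicit ActorSystem in RedisCache.
case class CacheLike(label: String)(implicit val system: Object)

object SerializationCheck {
  // Returns true when Java serialization of the object graph fails,
  // exactly as it does for ActorSystemImpl inside DaemonMsgCreate.
  def failsToSerialize(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      false
    } catch { case _: NotSerializableException => true }
}
```

Here `SerializationCheck.failsToSerialize(CacheLike("redis")(new Object))` yields true, while a plain String serializes fine: the serializable outer case class does not help when one of its fields is not serializable.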
Then, to work around the problem, I moved the redisCache into the Worker instead of passing it in from the master node:
case class Worker(
    awsBucket: String,
    gapMinValueMicroSec: Long,
    replyTo: ActorRef,
    mongoURI: String,
    mongoDBName: String,
    mongoCollectioName: String
) extends Actor with ActorLogging {

  // redis cache here now
  val redisCache = ...
But with such a design, every routee creates a new Redis cache instance, which is not the intended behavior. What I want is a single instance of my Redis cache that is shared among all my routees, but in the context of a cluster application that seems to be impossible. So I don't know whether this is a design failure on my part or whether I'm just missing some experience with Akka. If anyone has faced a similar problem, I'd gladly take suggestions!
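One direction that might apply here (not verified in this setup) is to keep the cache out of Props entirely and let each node build exactly one instance that all its local routees look up, for example through an Akka Extension: an Extension is instantiated at most once per ActorSystem, i.e. once per cluster node. A sketch with a stand-in for RedisCache (`NodeLocalCache` and `instanceId` are made-up names; the real version would construct `RedisCache(master, slaves)(system)` inside it):

```scala
import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}

// Sketch, not a verified fix: the Extension machinery guarantees that
// createExtension runs once per ActorSystem, so every routee on a node
// shares the same instance without it ever travelling inside Props.
// instanceId just makes the sharing observable.
class NodeLocalCache(system: ExtendedActorSystem) extends Extension {
  val instanceId: String = java.util.UUID.randomUUID().toString
  // here: build the real RedisCache(master, slaves)(system)
}

object NodeLocalCache extends ExtensionId[NodeLocalCache] with ExtensionIdProvider {
  override def lookup = NodeLocalCache
  override def createExtension(system: ExtendedActorSystem) = new NodeLocalCache(system)
}
```

Inside Worker one would then write `val redisCache = NodeLocalCache(context.system)` instead of receiving the cache as a constructor argument, which also keeps it out of the remote-deployment serialization path.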