With Slick you can do the following to produce a stream of results from a table:
val q = for (e <- events) yield e.name
val p: DatabasePublisher[String] = db.stream(q.result)
p.foreach { s => println(s"Event: $s") }
That will print all the events in the events table and terminate after the last row.
Assuming you can be notified in some way when new rows are inserted into the events table, is it possible to write a stream that continuously outputs events as they are inserted? A sort of tail -f for a DB table.
I don't think Slick supports this natively, but it should be possible to use Akka Streams to help: something that takes from the Slick Source until it is empty, then waits for an event indicating that more data is in the table, then streams the new data. Possibly by using an ActorPublisher to tie this logic together?
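To make it concrete, here is the rough, untested shape I have in mind, with a timed poll standing in for the real "new data" notification (it assumes events rows carry an increasing numeric id alongside name, and that db and an implicit ExecutionContext are in scope):

import akka.NotUsed
import akka.stream.ThrottleMode
import akka.stream.scaladsl.Source

import scala.concurrent.duration._

// Untested sketch: repeatedly ask Slick for rows newer than the last id seen.
// The throttle is a crude stand-in for "wait until notified of new data".
val tailed: Source[String, NotUsed] =
  Source
    .unfoldAsync(0L) { lastId =>
      db.run(events.filter(_.id > lastId).sortBy(_.id).result).map { rows =>
        val nextId = rows.lastOption.map(_.id).getOrElse(lastId)
        Some(nextId -> rows.map(_.name).toList)   // never complete, like tail -f
      }
    }
    .throttle(1, 1.second, 1, ThrottleMode.Shaping)
    .mapConcat(identity)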
Just wondering if someone has any experience in this area or any advice?
Answer 0 (score: 4)
You are right about ActorPublisher :). Here is a simple example using PostgreSQL, the async DB driver, and the LISTEN/NOTIFY mechanism:
The actor:
import akka.stream.actor.ActorPublisher
import com.github.mauricio.async.db.postgresql.PostgreSQLConnection
import com.github.mauricio.async.db.postgresql.util.URLParser

import scala.concurrent.Await
import scala.concurrent.duration._

class PostgresListener extends ActorPublisher[String] {
  override def receive = {
    case _ ⇒
      // host, port, db, user, password and channel are assumed to be in scope
      val configuration = URLParser.parse(s"jdbc:postgresql://$host:$port/$db?user=$user&password=$password")
      val connection = new PostgreSQLConnection(configuration)
      Await.result(connection.connect, 5.seconds)
      connection.sendQuery(s"LISTEN $channel")
      connection.registerNotifyListener { message ⇒ onNext(message.payload) }
  }
}
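Note that onNext throws if the downstream has not signalled demand yet, and here it is called from the driver's callback thread. A more defensive variant of the same idea (just a sketch, not part of the original answer) forwards payloads to the actor and buffers them until there is demand:

import akka.stream.actor.ActorPublisher
import akka.stream.actor.ActorPublisherMessage.{ Cancel, Request }

import scala.collection.mutable

// Same LISTEN/registerNotifyListener setup as above, but the callback sends
// the payload to the actor (self ! payload) instead of calling onNext directly.
class BufferingPostgresListener extends ActorPublisher[String] {
  private val buffer = mutable.Queue.empty[String]

  override def receive = {
    case payload: String =>   // a notification payload forwarded via self ! payload
      buffer.enqueue(payload)
      deliver()
    case Request(_) =>        // downstream asked for more elements
      deliver()
    case Cancel =>            // downstream cancelled the stream
      context.stop(self)
  }

  private def deliver(): Unit =
    while (totalDemand > 0 && buffer.nonEmpty)
      onNext(buffer.dequeue())
}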
The service:
def stream: Source[ServerSentEvent, Unit] = {
  // `system` is the surrounding ActorSystem
  val dataPublisherRef = system.actorOf(Props[PostgresListener])
  val dataPublisher = ActorPublisher[String](dataPublisherRef)
  dataPublisherRef ! "go"

  Source(dataPublisher)   // Source.fromPublisher(dataPublisher) on newer Akka versions
    .map(ServerSentEvent(_))
    .via(WithHeartbeats(10.second))
}
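To try it out, the resulting Source can be run straight into a println (a quick sketch; it assumes the implicit ActorSystem used by stream is in scope):

import akka.stream.ActorMaterializer

// Illustrative only: print every ServerSentEvent produced by `stream` above.
implicit val materializer = ActorMaterializer()
stream.runForeach(println)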
The libraryDependencies entry in build.sbt:
"com.github.mauricio" %% "postgresql-async" % "0.2.18"
A Postgres trigger should call select pg_notify('foo', 'payload').
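As a sketch of that server-side piece (the function and trigger names below are made up, and the inserted row's name is used as the payload instead of a fixed string), the DDL could even be installed through the same async connection:

import scala.concurrent.ExecutionContext.Implicits.global

// Assumed DDL: a trigger function that NOTIFYs channel 'foo' with the new
// row's name on every insert into the events table.
val createFunction =
  """CREATE OR REPLACE FUNCTION notify_event() RETURNS trigger AS $$
    |BEGIN
    |  PERFORM pg_notify('foo', NEW.name);
    |  RETURN NEW;
    |END;
    |$$ LANGUAGE plpgsql""".stripMargin

val createTrigger =
  """CREATE TRIGGER events_notify AFTER INSERT ON events
    |FOR EACH ROW EXECUTE PROCEDURE notify_event()""".stripMargin

for {
  _ <- connection.sendQuery(createFunction)
  _ <- connection.sendQuery(createTrigger)
} yield ()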
As far as I know, Slick does not support LISTEN.
Answer 1 (score: 0)
ActorPublisher has been deprecated since Akka 2.5.0. Here is an alternative that uses the postgresql-async library and creates a SourceQueue inside an actor:
import akka.actor._
import akka.pattern.pipe
import akka.stream._
import akka.stream.scaladsl._
import com.github.mauricio.async.db.postgresql.PostgreSQLConnection
import com.github.mauricio.async.db.postgresql.util.URLParser

import scala.concurrent.Await
import scala.concurrent.duration._

class DbActor(implicit materializer: ActorMaterializer) extends Actor with ActorLogging {
  private implicit val ec = context.system.dispatcher

  // Notifications are pushed into this queue, which feeds the stream.
  val queue =
    Source.queue[String](Int.MaxValue, OverflowStrategy.backpressure)
      .to(Sink.foreach(println))
      .run()

  val configuration = URLParser.parse("jdbc:postgresql://localhost:5233/my_db?user=dbuser&password=pwd")
  val connection = new PostgreSQLConnection(configuration)
  Await.result(connection.connect, 5.seconds)

  connection.sendQuery("LISTEN my_channel")
  connection.registerNotifyListener { message =>
    val msg = message.payload
    log.debug("Sending the payload: {}", msg)
    self ! msg
  }

  def receive = {
    case payload: String =>
      // offer returns a Future[QueueOfferResult]; pipe the result back to this actor
      queue.offer(payload).pipeTo(self)
    case QueueOfferResult.Dropped =>
      log.warning("Dropped a message.")
    case QueueOfferResult.Enqueued =>
      log.debug("Enqueued a message.")
    case QueueOfferResult.Failure(t) =>
      log.error("Stream failed: {}", t.getMessage)
    case QueueOfferResult.QueueClosed =>
      log.debug("Stream closed.")
  }
}
The code above just prints the notifications from PostgreSQL as they arrive; you can replace Sink.foreach(println) with any other Sink. To run it:
import akka.actor._
import akka.stream.ActorMaterializer

object Example extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()

  system.actorOf(Props(classOf[DbActor], materializer))
}
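As one example of replacing Sink.foreach(println): the queue inside DbActor could be materialized together with a BroadcastHub, turning the notifications into a reusable Source that any number of consumers can attach to (a sketch, assuming Akka 2.5+ and the same implicit materializer as in DbActor):

import akka.stream.OverflowStrategy
import akka.stream.scaladsl.{ BroadcastHub, Keep, Source }

// Inside DbActor, instead of .to(Sink.foreach(println)): materialize the queue
// together with a Source that broadcasts every offered element.
val (queue, notifications) =
  Source.queue[String](Int.MaxValue, OverflowStrategy.backpressure)
    .toMat(BroadcastHub.sink[String])(Keep.both)
    .run()

// Elsewhere in the application (needs a materializer in scope):
// notifications.runForeach(msg => println(s"Got notification: $msg"))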