Reading a large file with Akka Streams

Asked: 2016-02-05 11:01:57

Tags: scala akka-stream

I have been experimenting with Akka Streams; here is my short snippet:

  override def main(args: Array[String]) {
    val filePath = "/Users/joe/Softwares/data/FoodFacts.csv"//args(0)

    val file = new File(filePath)
    println(file.getAbsolutePath)
    // read 1MB of file as a stream
    val fileSource = SynchronousFileSource(file, 1 * 1024 * 1024)
    val shaFlow = fileSource.map(chunk => {
      println(s"the string obtained is ${chunk.toString}")
    })
    shaFlow.to(Sink.foreach(println(_))).run // fails with a null pointer

    def sha256(s: String) = {
    val messageDigest = MessageDigest.getInstance("SHA-256")
      messageDigest.digest(s.getBytes("UTF-8"))
    }
  }

When I run this snippet, I get:

Exception in thread "main" java.lang.NullPointerException
    at akka.stream.scaladsl.RunnableGraph.run(Flow.scala:365)
    at com.test.api.consumer.DataScienceBoot$.main(DataScienceBoot.scala:30)
    at com.test.api.consumer.DataScienceBoot.main(DataScienceBoot.scala)

It seems to me that fileSource is not empty, is it? So why does this happen? Any ideas? FoodFacts.csv is around 40MB, and all I am trying to do is stream it in 1MB chunks!

It does not work even with the default chunk size of 8192!

1 Answer:

Answer 0 (score: 4)

1.0 is deprecated. Use 2.x if you can.

When I tried your snippet against 2.0.1, using FileIO.fromFile(file) in place of the removed SynchronousFileSource, it failed with the same null pointer. That was simply because there was no implicit ActorMaterializer in scope. Bringing one into scope makes it work:

import java.io.File

import scala.concurrent.Future

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{FileIO, Source}

object TImpl extends App {
  implicit val system = ActorSystem("Sys")
  implicit val materializer = ActorMaterializer() // required to materialize and run the stream

  val file = new File("somefile.csv")
  // read the file as a stream of 1MB ByteString chunks
  val fileSource = FileIO.fromFile(file, 1 * 1024 * 1024)
  val shaFlow: Source[String, Future[Long]] = fileSource.map { chunk =>
    s"the string obtained is ${chunk.toString()}"
  }

  shaFlow.runForeach(println(_))
}

This works for files of any size. For more information on dispatcher configuration, see here.
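As an aside, the question's unused sha256 helper digests a whole String at once, which defeats the point of streaming. java.security.MessageDigest can instead consume the chunks incrementally via update, so each 1MB ByteString from the stream could be folded into the digest without holding the file in memory. A minimal stdlib-only sketch (no Akka; ChunkedSha and sha256Chunked are hypothetical names, not part of the original code):

```scala
import java.security.MessageDigest

object ChunkedSha {
  // Feed the input to the digest in fixed-size chunks, the same way each
  // streamed ByteString chunk could be passed to md.update in a fold.
  def sha256Chunked(bytes: Array[Byte], chunkSize: Int): String = {
    val md = MessageDigest.getInstance("SHA-256")
    bytes.grouped(chunkSize).foreach(md.update)
    md.digest().map("%02x".format(_)).mkString
  }

  def main(args: Array[String]): Unit =
    // chunked digesting yields the same hash as digesting all at once
    println(sha256Chunked("abc".getBytes("UTF-8"), 2))
}
```

Because update is cumulative, the resulting hash is identical regardless of chunk size, so the 1MB chunking from FileIO.fromFile does not affect the final digest.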