Multiplication of "Double" values in Scala

Asked: 2017-10-22 22:11:09

Tags: scala apache-spark

I want to multiply two sparse matrices with Spark in Scala. I pass the matrices in as arguments and store the result in another argument. The matrices are text files in which each matrix element is represented as: row,column,value.

I am unable to multiply two Double values in Scala.

    import org.apache.spark.{SparkConf, SparkContext}

    object MultiplySpark {
        def main(args: Array[String]) {
            val conf = new SparkConf().setAppName("Multiply")
            conf.setMaster("local[2]")
            val sc = new SparkContext(conf)

            // Parse the first matrix: each line is "row,column,value".
            val M = sc.textFile(args(0)).flatMap(entry => {
                val rec = entry.split(",")
                val row = rec(0).toInt
                val column = rec(1).toInt
                val value = rec(2).toDouble

                for {pointer <- 1 until rec.length} yield ((row, column), value)
            })

            // Parse the second matrix the same way.
            val N = sc.textFile(args(1)).flatMap(entry => {
                val rec = entry.split(",")
                val row = rec(0).toInt
                val column = rec(1).toInt
                val value = rec(2).toDouble

                for {pointer <- 1 until rec.length} yield ((row, column), value)
            })

            val Mmap = M.map(e => (e._2, e))
            val Nmap = N.map(d => (d._2, d))

            val MNjoin = Mmap.join(Nmap).map { case (k, (e, d)) => e._2.toDouble + "," + d._2.toDouble }

            val result = MNjoin.reduceByKey((a, b) => a * b)
                .map(entry => ((entry._1._1, entry._1._2), entry._2))
                .reduceByKey((a, b) => a + b)

            result.saveAsTextFile(args(2))
            sc.stop()
        }
    }

How can I multiply Double values in Scala? Please note: I already tried a.toDouble * b.toDouble

The error is: value * is not a member of (Double, Double)
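For illustration, this error can be reproduced without Spark, since it comes from the element type that the reduce function receives, not from Double itself (a minimal, hypothetical example; the values are made up):

```scala
// Each element looks like ((row, col), (Double, Double)), the shape a join produces.
val pairs = Seq(((1, 1), (2.0, 3.0)), ((1, 1), (4.0, 5.0)))

// This is what reduceByKey would see: a and b are (Double, Double) tuples,
// so `a * b` fails to compile with: value * is not a member of (Double, Double)
// pairs.map(_._2).reduce((a, b) => a * b)   // does not compile

// Multiplying the Double components inside each pair compiles fine:
val products = pairs.map { case (k, (x, y)) => (k, x * y) }
println(products) // List(((1,1),6.0), ((1,1),20.0))
```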


1 Answer:

Answer 0 (score: 1)

This reduceByKey would work if the join gave you an RDD[(SomeType, Double)]. Instead, joining two RDD[((Int, Int), Double)]s produces an RDD[((Int, Int), (Double, Double))], so the a and b that reduceByKey passes to your function are (Double, Double) pairs, not Doubles; that is why the compiler reports that * is not a member of (Double, Double).
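The usual fix is to join on the shared dimension and multiply the plain Double values before summing per output cell. A sketch with plain Scala collections, mirroring the Spark pattern M.map(v => (col, (row, v))).join(N.map(v => (row, (col, v)))).map { case (_, ((i, mv), (j, nv))) => ((i, j), mv * nv) }.reduceByKey(_ + _); all names and the sample matrices here are my own, not from the answer:

```scala
// Sparse matrices as (row, col, value) triples.
// M = [[1,2],[0,3]], N = [[4,0],[1,1]], so M*N = [[6,2],[3,3]].
val m = Seq((0, 0, 1.0), (0, 1, 2.0), (1, 1, 3.0))
val n = Seq((0, 0, 4.0), (1, 0, 1.0), (1, 1, 1.0))

// Key M by its column and N by its row: the shared dimension k.
val mByK = m.groupBy(_._2)
val nByK = n.groupBy(_._1)

// "join" on k and multiply the plain Double values (not tuples).
val partials = for {
  k <- (mByK.keySet intersect nByK.keySet).toSeq
  (i, _, mv) <- mByK(k)
  (_, j, nv) <- nByK(k)
} yield ((i, j), mv * nv)

// Sum the partial products per (i, j) cell, like reduceByKey(_ + _).
val result = partials.groupBy(_._1).map { case (ij, ps) => (ij, ps.map(_._2).sum) }
println(result)
```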