Spark (1.6): DenseVector.type does not take parameters

Date: 2017-10-26 12:18:04

Tags: scala apache-spark apache-spark-mllib

I am following this article:

http://learningfrombigdata.com/semantic-similarity-between-sentences-using-apache-spark/

However, when I get to this part:

def distance(lvec: String, rvec: String): Double = {
    val l = DenseVector(lvec.split(',').map(_.toDouble))
    val r = DenseVector(rvec.split(',').map(_.toDouble))
    math.sqrt(sum((l - r) :* (l - r)))
}

I get the following error:

Name: Compile Error
Message: <console>:177: error: org.apache.spark.mllib.linalg.DenseVector.type does not take parameters
           val l = DenseVector(lvec.split(',').map(_.toDouble))
                              ^
<console>:178: error: org.apache.spark.mllib.linalg.DenseVector.type does not take parameters
           val r = DenseVector(rvec.split(',').map(_.toDouble))

I suspect it may be version-related (I am using Spark 1.6.0), but I am not sure, and I could not find any more information about this error online. Any help would be appreciated.

1 Answer:

Answer 0 (score: 1)

You are missing new. Unlike Breeze's DenseVector, the MLlib DenseVector companion object has no apply factory method, so you must invoke the constructor explicitly:

scala> import org.apache.spark.mllib.linalg.DenseVector
import org.apache.spark.mllib.linalg.DenseVector

scala> new DenseVector(Array(1, 2, 3))
res1: org.apache.spark.mllib.linalg.DenseVector = [1.0,2.0,3.0]
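That said, the tutorial's distance function almost certainly intended Breeze vectors rather than MLlib ones: the element-wise :* operator and the sum function it uses exist in Breeze, not on org.apache.spark.mllib.linalg.DenseVector. A minimal sketch of the function with the Breeze import (an assumption about the tutorial's intent, not confirmed by the question):

```scala
// Sketch assuming the tutorial meant Breeze's DenseVector, whose
// companion object does provide apply(), so no `new` is needed.
import breeze.linalg.{DenseVector, sum}

def distance(lvec: String, rvec: String): Double = {
  val l = DenseVector(lvec.split(',').map(_.toDouble))
  val r = DenseVector(rvec.split(',').map(_.toDouble))
  // Euclidean distance: sqrt of the sum of squared element-wise differences
  math.sqrt(sum((l - r) :* (l - r)))
}

println(distance("1,2,3", "4,6,3")) // sqrt(9 + 16 + 0) = 5.0
```

With this import the original code compiles unchanged; alternatively, keeping the MLlib import and adding new fixes only the constructor error, and the :* line would still fail.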