I have a sequence file generated by Mahout, and I would like to convert its values from VectorWritable into Spark Vectors. How can I do that in Scala?
Answer 0: (score: 1)
Assuming we have the rdd of (key, VectorWritable) pairs from your previous question, first bring in the Java-to-Scala collection converter:

import scala.collection.JavaConverters.iterableAsScalaIterableConverter
def mahoutToScala(v: org.apache.mahout.math.VectorWritable) = {
  // Copy the Mahout vector's element values into a Scala array and wrap them in an MLlib dense vector
  val scalaArray = v.get.all.asScala.map(_.get).toArray
  org.apache.spark.mllib.linalg.Vectors.dense(scalaArray)
}
rdd.map{ case (k, v) => (k.toString, mahoutToScala(v))}
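
Since the previous question is not shown here, below is only a minimal sketch of how such an rdd might be loaded from the Mahout-generated sequence file. The file path, the Text key type, and the existing SparkContext sc are assumptions, not details from the original answer.

import org.apache.hadoop.io.Text
import org.apache.mahout.math.VectorWritable

// Hypothetical path; keys are assumed to be Hadoop Text and values Mahout VectorWritable.
val rdd = sc.sequenceFile("/path/to/mahout/vectors", classOf[Text], classOf[VectorWritable])

// Convert right away, because Hadoop Writable instances may be reused between records.
val sparkVectors = rdd.map { case (k, v) => (k.toString, mahoutToScala(v)) }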