Field "item" does not exist when using ALS with a Spark MLlib Pipeline

Time: 2015-05-09 14:14:38

Tags: scala apache-spark apache-spark-mllib

I am training a recommender system with ALS (Spark version: 1.3.1). Now I would like to do model selection via cross-validation using a Pipeline. As a first step, I tried to adapt the example code and came up with this:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.rdd.RDD
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.recommendation.ALS

val conf = new SparkConf().setAppName("ALS").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

val ratings: RDD[org.apache.spark.mllib.recommendation.Rating] = // ...
val als = new ALS().setMaxIter(10).setRank(10).setRegParam(0.01)
val pipeline = new Pipeline().setStages(Array(als))
val model = pipeline.fit(ratings.toDF)

When I run it, the last line fails with the following exception:

Exception in thread "main" java.lang.IllegalArgumentException: Field "item" does not exist.
at org.apache.spark.sql.types.StructType$$anonfun$apply$25.apply(dataTypes.scala:1032)
at org.apache.spark.sql.types.StructType$$anonfun$apply$25.apply(dataTypes.scala:1032)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:58)
at org.apache.spark.sql.types.StructType.apply(dataTypes.scala:1031)
at org.apache.spark.ml.recommendation.ALSParams$class.validateAndTransformSchema(ALS.scala:148)
at org.apache.spark.ml.recommendation.ALS.validateAndTransformSchema(ALS.scala:229)
at org.apache.spark.ml.recommendation.ALS.transformSchema(ALS.scala:304)
at org.apache.spark.ml.Pipeline$$anonfun$transformSchema$4.apply(Pipeline.scala:142)
at org.apache.spark.ml.Pipeline$$anonfun$transformSchema$4.apply(Pipeline.scala:142)
at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
at scala.collection.mutable.ArrayOps$ofRef.foldLeft(ArrayOps.scala:108)
at org.apache.spark.ml.Pipeline.transformSchema(Pipeline.scala:142)
at org.apache.spark.ml.PipelineStage.transformSchema(Pipeline.scala:58)
at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:100)
at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:79)
at org.apache.spark.ml.Estimator.fit(Estimator.scala:44)
...

I don't use the string "item" anywhere in my code, so I assume it is some kind of default. When I add .setItemCol("itemId") to als, the exception message changes accordingly.

"item"的含义是什么?如何使管道工作?

1 answer:

Answer 0 (score: 1)

OK, the solution turned out to be quite simple: use org.apache.spark.ml.recommendation.ALS.Rating instead of org.apache.spark.mllib.recommendation.Rating and it just works.
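
For illustration, here is a minimal sketch of the fixed version. It reuses the sc/sqlContext setup from the question; the tiny in-memory ratings are made-up placeholder data, not from the original post:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.ml.recommendation.ALS.Rating

// ml's Rating case class has fields named user, item and rating, so toDF
// produces exactly the column names that ml's ALS expects by default.
val ratings = sc.parallelize(Seq(
  Rating(0, 0, 4.0f), Rating(0, 1, 2.0f),
  Rating(1, 1, 3.0f), Rating(1, 2, 5.0f)
))

val als = new ALS().setMaxIter(10).setRank(10).setRegParam(0.01)
val pipeline = new Pipeline().setStages(Array(als))
val model = pipeline.fit(ratings.toDF)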

Alternatively, .setItemCol("product") also does the trick, because org.apache.spark.mllib.recommendation.Rating has a field named "product", whereas org.apache.spark.ml.recommendation.ALS.Rating names the corresponding field "item". Converting the RDD of case-class instances with toDF gives the DataFrame columns the case-class field names (via Scala reflection), and ALS then looks up its user, item and rating columns in that schema by name, which is why the column name has to match.
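
A sketch of that alternative, again reusing the question's sc and sqlContext.implicits._ and made-up toy ratings, keeping the mllib Rating and only remapping the item column:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.mllib.recommendation.Rating

// mllib's Rating has fields user, product and rating, so toDF yields columns
// named "user", "product" and "rating"; ALS is pointed at "product" explicitly.
val ratings = sc.parallelize(Seq(
  Rating(0, 0, 4.0), Rating(0, 1, 2.0),
  Rating(1, 1, 3.0), Rating(1, 2, 5.0)
))

val als = new ALS()
  .setMaxIter(10).setRank(10).setRegParam(0.01)
  .setItemCol("product")

val pipeline = new Pipeline().setStages(Array(als))
val model = pipeline.fit(ratings.toDF)

Either way, ALS in this Spark version also expects the user and item columns to be integers, which both Rating classes already satisfy.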