I'm trying to build a Spark application with sbt, following this tutorial: http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications
But I get the following errors:
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:22: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.evaluation.RegressionEvaluator
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:23: object ml is not a member of package org.apache.spark
[error] import org.apache.spark.ml.recommendation.ALS
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:25: object sql is not a member of package org.apache.spark
[error] import org.apache.spark.sql.SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:46: not found: value SparkSession
[error] val spark = SparkSession
[error] ^
[error] /home/zeng/workspace/spark/als/src/main/scala/ALSExample.scala:61: not found: type ALS
[error] val als = new ALS()
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
Why does this happen? BTW, the Spark version is 2.0.0.
Answer 0 (score: 5)
So, as suspected, these errors mean you haven't included all of the Spark libraries in your build file. What you're missing is:
"org.apache.spark" %% "spark-mllib" % "2.0.0"
Since you're using DataFrames, you'll also need:
"org.apache.spark" %% "spark-sql" % "2.0.0"