Scala console error: object apache is not a member of package org

Asked: 2015-04-08 13:24:42

Tags: scala apache-spark

I am trying the code suggested here: http://spark.apache.org/docs/1.2.1/mllib-ensembles.html#classification

using the Scala console (Scala code runner version 2.10.4), and I get the following error:

scala> import org.apache.spark.mllib.tree.RandomForest
<console>:8: error: object apache is not a member of package org
           import org.apache.spark.mllib.tree.RandomForest
                      ^
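A plain scala REPL has no Spark jars on its classpath, which is why nothing under org.apache resolves. One way to run the MLlib example interactively is the spark-shell script shipped with a Spark distribution, which starts a Scala REPL with Spark already loaded. A minimal sketch, assuming a Spark 1.2.1 installation is unpacked in the current directory:

$ ./bin/spark-shell

scala> import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.RandomForest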

Then, following the suggestion here, I tried to build a simple self-contained application, but ran into another problem:

root@sd:~/simple# sbt package
[info] Set current project to Simple Project (in build file:/root/simple/)
[info] Updating {file:/root/simple/}default-c5720e...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10.4;1.2.0 ...
[warn]  module not found: org.apache.spark#spark-core_2.10.4;1.2.0
[warn] ==== local: tried
[warn]   /root/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.2.0/spark-core_2.10.4-1.2.0.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10.4;1.2.0: not found

Can anyone suggest what I might try?

1 Answer:

Answer 0 (score: 3)

You can find detailed steps for writing a self-contained Spark application with SBT in Scala in this post. In your sbt build file you need to specify the dependencies. Note that the artifact suffix is the Scala binary version (_2.10), not the full Scala version: the spark-core_2.10.4 artifact that sbt tried to resolve in your log does not exist.

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10"  % "1.2.1",
  "org.apache.spark" % "spark-mllib_2.10" % "1.2.1")

Then compile with the following command:

sbt package
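
Once the dependencies resolve, a self-contained application mirroring the linked MLlib ensembles example might look like the sketch below. This is illustrative rather than the original poster's code: the object name SimpleApp, the input path, and the training parameters are placeholders following the Spark 1.2.1 documentation example.

// src/main/scala/SimpleApp.scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.util.MLUtils

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Project")
    val sc = new SparkContext(conf)

    // Load a LIBSVM-format data file as an RDD[LabeledPoint]
    val data = MLUtils.loadLibSVMFile(sc, "data/mllib/sample_libsvm_data.txt")

    // Train a random forest classifier; values follow the documentation example
    val model = RandomForest.trainClassifier(data,
      numClasses = 2,
      categoricalFeaturesInfo = Map[Int, Int](),
      numTrees = 3,
      featureSubsetStrategy = "auto",
      impurity = "gini",
      maxDepth = 4,
      maxBins = 32)

    println("Learned classification forest model:\n" + model.toDebugString)
    sc.stop()
  }
}

The jar produced by sbt package can then be run with bin/spark-submit --class SimpleApp from the Spark installation directory; the jar's file name follows the name and version set in build.sbt.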