Error building Spark 1.3.0 with JDK 1.6.0_45, Maven 3.0.5 on CentOS 6

Date: 2015-04-09 20:33:37

Tags: java scala maven apache-spark spark-streaming

When I try to build Spark 1.3.0 with additional dependencies added to the package, I get errors about class mismatches:

```
[warn] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:23: imported `Clock' is permanently hidden by definition of trait Clock in package spark
[warn] import org.apache.spark.util.{SystemClock, Clock}
[warn]                                            ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:127: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:66: reference to Clock is ambiguous;
[error] it is imported twice in the same scope by
[error] import org.apache.spark.util._
[error] and import org.apache.spark._
[error]     clock: Clock = new SystemClock())
[error]            ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:34: imported `Clock' is permanently hidden by definition of trait Clock in package worker
[warn] import org.apache.spark.util.{Clock, SystemClock}
[warn]                               ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:61: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.deploy.worker.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:190: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       val processStart = clock.getTimeMillis()
[error]                                ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:192: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       if (clock.getTimeMillis() - processStart > successfulRunDuration * 1000) {
[error]                 ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:37: imported `MutableURLClassLoader' is permanently hidden by definition of trait MutableURLClassLoader in package executor
[warn] import org.apache.spark.util.{ChildFirstURLClassLoader, MutableURLClassLoader,
[warn]                                                         ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:319: type mismatch;
[error]  found   : org.apache.spark.util.ChildFirstURLClassLoader
[error]  required: org.apache.spark.executor.MutableURLClassLoader
[error]       new ChildFirstURLClassLoader(urls, currentLoader)
[error]       ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:321: trait MutableURLClassLoader is abstract; cannot be instantiated
[error]       new MutableURLClassLoader(urls, currentLoader)
[error]       ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/local/LocalBackend.scala:89: postfix operator millis should be enabled
[warn] by making the implicit value scala.language.postfixOps visible.
[warn] This can be achieved by adding the import clause 'import scala.language.postfixOps'
[warn] or by setting the compiler option -language:postfixOps.
[warn] See the Scala docs for value scala.language.postfixOps for a discussion
[warn] why the feature should be explicitly enabled.
[warn]       context.system.scheduler.scheduleOnce(1000 millis, self, ReviveOffers)
[warn]                                                  ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/util/MutableURLClassLoader.scala:26: imported `ParentClassLoader' is permanently hidden by definition of class ParentClassLoader in package util
[warn] import org.apache.spark.util.ParentClassLoader
[warn]                              ^
[warn] 5 warnings found
[error] 7 errors found
```

I get the same errors when trying to build with the included Maven and JDK 1.7.

The full build output is at pastebin id i9PFEVJ8, and the full pom.xml is at pastebin id 8gEgT5EE.

[UPDATE]

I have changed the Spark versions to match 1.3.0, and now I get cyclic dependency errors:

```
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
```

1 Answer:

Answer 0 (score: 0)

I realized that the pre-built Spark 1.3.0 for MapR 3.x already ships with the Kafka streaming module; the module and its dependencies are only needed in your own application if you want to produce streams there.
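
As an illustration only (not part of the original answer), here is a minimal application pom.xml sketch under that assumption: the cluster's pre-built Spark 1.3.0 supplies core and streaming at runtime, so they are marked `provided`, and only the Kafka streaming module is bundled with the application instead of being compiled into Spark itself.

```
<!-- Hypothetical application pom.xml fragment, assuming the pre-built
     Spark 1.3.0 on the MapR 3.x cluster provides core and streaming at runtime. -->
<dependencies>
    <!-- Provided by the Spark distribution; not bundled into the application jar -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.3.0</version>
        <scope>provided</scope>
    </dependency>
    <!-- Only needed if the application itself consumes or produces Kafka streams -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>1.3.0</version>
    </dependency>
</dependencies>
```

With this layout the Spark build itself is left untouched, which avoids the kind of class shadowing and cyclic dependency errors shown above.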