After cloning the SBT repository and trying to start an SBT shell in the directory, I get the following error:
java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test
The full stack trace is shown below:
[error] java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test
[error] at scala.Predef$.require(Predef.scala:277)
[error] at sbt.librarymanagement.DependencyBuilders.moduleIDConfigurable(DependencyBuilders.scala:30)
[error] at sbt.librarymanagement.DependencyBuilders.moduleIDConfigurable$(DependencyBuilders.scala:29)
[error] at sbt.package$.moduleIDConfigurable(package.scala:6)
[error] at $080896ebbef320cbbd4a$.$anonfun$$sbtdef$2(/Users/username/company/repo/submodule/build.sbt:37)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:234)
[error] at scala.collection.immutable.List.foreach(List.scala:389)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:234)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:227)
[error] at scala.collection.immutable.List.map(List.scala:295)
[error] at $080896ebbef320cbbd4a$.$anonfun$$sbtdef$1(/Users/username/company/repo/submodule/build.sbt:37)
[error] at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:197)
[error] at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:214)
[error] at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:159)
[error] at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:82)
[error] at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:93)
[error] at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:89)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] java.lang.IllegalArgumentException: requirement failed: Configurations already specified for module com.holdenkarau:spark-testing-base:2.2.0_0.7.2:test
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
What is the cause of this error, and how can I overcome it? My project configuration is shown below.
UPDATE-1
Here is my build.sbt file:
import AwsDependencies._
import Dependencies._
import SparkDependencies._
version := "0.0.1"
// core settings
organization := "com.company"
scalaVersion := "2.11.11"
// cache options
offline := false
updateOptions := updateOptions.value.withCachedResolution(true)
// aggregate options
aggregate in assembly := false
aggregate in update := false
// fork options
fork in Test := true
name := "Submodule"
version := "0.0.1"
//common libraryDependencies
libraryDependencies ++= Seq(
scalaTest,
typesafeConfig,
jodaTime,
mysql,
json,
scopt,
awsS3,
sparkTesting
)
libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided")
assemblyMergeStrategy in assembly := {
case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
case m if m.startsWith("META-INF") => MergeStrategy.discard
case PathList("javax", "servlet", _@_*) => MergeStrategy.first
case PathList("org", "apache", _@_*) => MergeStrategy.first
case PathList("org", "jboss", _@_*) => MergeStrategy.first
case "about.html" => MergeStrategy.rename
case "reference.conf" => MergeStrategy.concat
case "application.conf" => MergeStrategy.concat
case _ => MergeStrategy.first
}
The stack trace reports the error at the following line of this build.sbt (for the relevant submodule):
libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided")
Answer 0 (score: 2)
I know this answer may be a bit late, but it looks like one of the entries in your SparkDependencies.allSparkDependencies already includes % provided, so SparkDependencies.allSparkDependencies.map(_ % "provided") is trying to add it again, which causes the problem. Try removing % provided from the entries in SparkDependencies.allSparkDependencies.
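The mechanism can be sketched as follows. SparkDependencies.scala is not shown in the question, so the object contents below are illustrative assumptions, not the actual file:

```scala
// Hypothetical SparkDependencies.scala (the real file is not shown in the
// question; the module vals here are assumptions for illustration).
import sbt._

object SparkDependencies {
  // Problematic: a configuration is already attached to this ModuleID, so the
  // later `.map(_ % "provided")` in build.sbt applies `%` a second time and
  // triggers "Configurations already specified for module ...".
  val sparkCoreBad = "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

  // Fix: declare modules without a configuration and let build.sbt attach it
  // exactly once via `.map(_ % "provided")`.
  val sparkCore = "org.apache.spark" %% "spark-core" % "2.2.0"
  val sparkSql  = "org.apache.spark" %% "spark-sql"  % "2.2.0"

  val allSparkDependencies = Seq(sparkCore, sparkSql)
}
```

With the configuration removed from the vals, the `libraryDependencies ++= SparkDependencies.allSparkDependencies.map(_ % "provided")` line in build.sbt becomes the single place where the configuration is set.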
Answer 1 (score: 1)
I ran into the same problem in a different setup.
In my case, the problem was caused by having the test specifier in two different places in my sbt configuration. In my Dependencies.scala:
object Dependencies {
lazy val someLibrary = "org.foo" %% "someLibrary" % "1.0.0" % "test"
}
And in build.sbt:
lazy val root = (project in file("."))
  .settings(
    libraryDependencies += someLibrary % Test
  )
Once I removed % "test" from the dependency's val expression, the problem was solved.
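As a sketch of the fix, using the simplified names from this answer, the configuration ends up attached in exactly one place (build.sbt):

```scala
// Dependencies.scala -- declare the module WITHOUT a configuration here.
import sbt._

object Dependencies {
  lazy val someLibrary = "org.foo" %% "someLibrary" % "1.0.0"
}

// build.sbt -- attach the Test configuration here, exactly once:
//
//   lazy val root = (project in file("."))
//     .settings(
//       libraryDependencies += Dependencies.someLibrary % Test
//     )
```

The general rule is that `%` on a ModuleID that already carries a configuration raises the IllegalArgumentException seen above, so each dependency should receive its configuration from only one of the two files.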