Unresolved dependencies path for an SBT project in IntelliJ

Asked: 2017-06-10 12:46:38

Tags: scala intellij-idea sbt spark-streaming

I use IntelliJ to develop Spark applications, and I followed instructions on how to make IntelliJ play nicely with an SBT project.

Since my whole team uses IntelliJ, we can simply modify build.sbt, but we are getting this unresolved dependency error:

Error: Error while importing SBT project:

[info] Resolving org.apache.thrift#libfb303;0.9.2 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-streaming_2.10;2.1.0 ...
[info] Resolving org.apache.spark#spark-parent_2.10;2.1.0 ...
[info] Resolving org.scala-lang#jline;2.10.6 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      sparrow-to-orc:sparrow-to-orc_2.10:0.1
[warn]        +- mainrunner:mainrunner_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last mainRunner/:ssExtractDependencies' for the full output.
[trace] Stack trace suppressed: run 'last mainRunner/:update' for the full output.
[error] (mainRunner/:ssExtractDependencies) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] (mainRunner/:update) sbt.ResolveException: unresolved dependency: sparrow-to-orc#sparrow-to-orc_2.10;0.1: not found
[error] Total time: 47 s, completed Jun 10, 2017 8:39:57 AM

Here is my build.sbt:

name := "sparrow-to-orc"

version := "0.1"

scalaVersion := "2.11.8"

lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-hive" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0"
)

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

// Spark is supplied by the cluster at runtime, so mark it "provided"
// and keep it out of the assembly jar.
libraryDependencies ++= sparkDependencies.map(_ % "provided")

// mainRunner re-adds the Spark dependencies at "compile" scope so that
// the application can be run from IntelliJ with Spark on the classpath.
lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case "overview.html" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

// Override the run task so that `sbt run` uses the full compile classpath,
// including the "provided" Spark dependencies.
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

If I don't have these lines, the program builds fine:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
  libraryDependencies ++= sparkDependencies.map(_ % "compile")
)

But then I can't run the application from within IntelliJ, because the Spark dependencies are "provided" and therefore won't be included on the run classpath.
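
As an aside, the `run in Compile <<= ...` line at the bottom of the build.sbt is the usual sbt-side workaround for this: it redefines the run task so that `sbt run` uses the full compile classpath, "provided" dependencies included. The `<<=` operator was removed in sbt 1.x; a sketch of the equivalent in the newer slash syntax (assuming sbt 1.x) would be:

Compile / run := Defaults.runTask(
  Compile / fullClasspath,     // full classpath, including "provided" jars
  Compile / run / mainClass,
  Compile / run / runner
).evaluated

Note that this only affects `sbt run`; IntelliJ's own run configurations do not go through this task, which is exactly what the mainRunner subproject is meant to work around.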

1 Answer:

Answer 0 (score: 0)

I had the same problem. The solution is to set the Scala version of the mainRunner project to the same version declared at the top of the build.sbt file. Without it, mainRunner falls back to sbt's default Scala version (2.10), which is why the log shows it resolving sparrow-to-orc_2.10 while the root project is built for Scala 2.11:

lazy val mainRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile"),
    scalaVersion := "2.11.8"
)
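
To keep the two projects from drifting apart again, one option (a minimal sketch, relying on sbt's standard scoping rules) is to set the Scala version once at the build level so that every subproject inherits it:

// Set the Scala version for the whole build; the root project and
// mainRunner both inherit it unless they override it locally.
scalaVersion in ThisBuild := "2.11.8"

With that in place, the per-project scalaVersion settings become unnecessary.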

Good luck!