IntelliJ sbt unresolved dependencies

Time: 2019-07-05 07:54:27

Tags: intellij-idea sbt intellij-plugin

I imported a Scala sbt project into IntelliJ 2019.1 and am working on two separate machines:
- Machine 1, with a typical internet connection - let's call it "normal"
- Machine 2, with no internet connection - call it "blind"

IntelliJ and the Scala plugin are installed at the same versions on both the "normal" and the "blind" machine. On the "normal" machine everything works fine: I can compile, clean, and assemble the code from the IntelliJ sbt shell. On the "blind" machine I cannot build the project; I get unresolved-dependency errors in the sbt console. I have already copied the dependency caches (the ~/.sbt, ~/.ivy2, and ~/.m2 directories) from the "normal" machine over to the "blind" machine.
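For reference, besides copying the cache directories, sbt can be told explicitly which repositories to consult via a `~/.sbt/repositories` file. A minimal sketch for an offline machine, assuming the caches were copied to the default locations (the exact Ivy artifact pattern may need adjusting to match the local layout):

```
[repositories]
  local
  local-ivy-cache: file://${user.home}/.ivy2/cache/, [organisation]/[module]/[revision]/[type]s/[artifact](-[classifier]).[ext]
  local-maven: file://${user.home}/.m2/repository/
```

Passing `-Dsbt.override.build.repos=true` when launching sbt makes it use only the repositories listed in this file, which can surface whether the copied caches actually contain the missing artifact.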

Error message:

[info] Loading settings from idea.sbt ... 
[info] Loading global plugin from /home/myuser/.sbt/1.0/plugins
[info] Updating {file:/home/tomaszk/.sbt/1.0/plugins/}global-plugins...
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] :: io.github.sugakandrey#scala-compiler-indices-protocol_2.12;0.1.1: Resolution failed several times for dependency: io.github.sugakandrey#scala-compiler-indices-protocol_2.12;0.1.1 {compile=[default(compile)]}
...
[error] sbt.librarymanagement.ResolveException: unresolved dependency: io.github.sugakandrey#scala-compiler-indices-protocol_2.12...

build.properties

sbt.version=1.0.3

build.sbt

name := "my_proj"
version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.3" % "provided"
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.6.3" % "provided"
libraryDependencies += "joda-time" % "joda-time" % "2.10.1"
libraryDependencies += "com.databricks" % "spark-csv_2.10" % "1.5.0"
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyJarName in assembly := "my_proj.jar"

Should I change the sbt version, copy over some additional directory that dependency resolution relies on, or is the problem caused by something else entirely?
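As a side note for fully offline work, sbt has a built-in `offline` setting that asks the resolver to rely on local caches instead of attempting remote lookups. A hedged sketch of what could be added to `build.sbt` (it helps diagnose cache problems, but will not by itself fix a genuinely missing artifact):

```scala
// build.sbt (sketch): prefer locally cached artifacts and
// avoid contacting remote repositories during update.
offline := true
```

The same effect can be had per-invocation with `sbt "set offline := true" compile`.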

This thread is related to my earlier question: Use Scala on computer without internet connection

  • TK

1 answer:

Answer 0 (score: 0)

As @MarioGalic pointed out, the problem lay in the Scala and sbt versions: I had to downgrade the IntelliJ Scala plugin from 2019.1.9 to 2019.1.3 to resolve the issue.