Why does the IntelliJ IDEA debugger step into the wrong Scala version's library?

Time: 2014-06-02 15:31:34

Tags: scala intellij-idea sbt

I am using IntelliJ IDEA 13.1.2 with Scala plugin version 0.36.431 and sbt 0.13.1 on Windows 7.

The following build.sbt project definition does not reference any Scala version other than 2.9.3.

import sbt._
import Keys._
import AssemblyKeys._
import NativePackagerKeys._

name := "simplews"

version      := "0.1.0-SNAPSHOT"

val sparkVersion = "0.8.1-incubating"

scalaVersion := "2.9.3"

val akkaVersion = "2.0.5"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.9.3" % sparkVersion  % "compile->default"  withSources(),
  "org.apache.spark" % "spark-examples_2.9.3" % sparkVersion  % "compile->default"  withSources(),
  "org.apache.spark" % "spark-tools_2.9.3" % sparkVersion  % "compile->default"  withSources(),
  "org.scalatest" % "scalatest_2.9.3" % "1.9.2" % "test"  withSources(),
  "org.apache.spark" % "spark-repl_2.9.3" % sparkVersion  % "compile->default"  withSources(),
  "org.apache.kafka" % "kafka" % "0.7.2-spark",
  "com.thenewmotion.akka" % "akka-rabbitmq_2.9.2" % "0.0.2" % "compile->default"  withSources(),
  "com.typesafe.akka" % "akka-actor" % akkaVersion % "compile->default" withSources(),
  "com.typesafe.akka" % "akka-testkit" % akkaVersion % "compile->default" withSources(),
  "com.rabbitmq" % "amqp-client" % "3.0.1" % "compile->default"  withSources(),
  "org.specs2" % "specs2_2.9.3" % "1.12.4.1" % "compile->default"  withSources(),
  "com.nebhale.jsonpath" % "jsonpath" % "1.2" % "compile->default"  withSources(),
  "org.mockito" % "mockito-all" % "1.8.5",
  "junit" % "junit" % "4.11"
)

packagerSettings

packageArchetype.java_application

resolvers  ++= Seq(
  "Apache repo" at "https://repository.apache.org/content/repositories/releases",
  "Cloudera repo" at "https://repository.cloudera.com/artifactory/repo/org/apache/kafka/kafka/0.7.2-spark/",
  "akka rabbitmq" at "http://nexus.thenewmotion.com/content/repositories/releases-public",
  "Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
  Resolver.mavenLocal
)
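(An aside, not part of the original build: sbt can derive the `_2.9.3` suffix from `scalaVersion` via the `%%` operator, which avoids hard-coding the Scala version into every artifact name. A minimal sketch, assuming the same `scalaVersion` and `sparkVersion` as above; note that artifacts published only for a different Scala version, such as `akka-rabbitmq_2.9.2`, would still need the explicit single-`%` form.)

```scala
// Sketch only: with scalaVersion := "2.9.3", the %% operator appends
// "_2.9.3" to the artifact id automatically, so the suffix tracks
// scalaVersion instead of being repeated by hand.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "compile->default" withSources(),
  "org.scalatest"    %% "scalatest"  % "1.9.2"      % "test" withSources()
)
```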

However, as shown in the screenshot, the debugger has stepped into Scala 2.10.2. Note: the debugger correctly steps into 2.9.3 in some other debugging sessions.
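(A way to check which Scala artifacts actually end up on the runtime classpath, and hence which sources the debugger might attach: run `sbt "show compile:fullClasspath"` and scan the output for version suffixes. The pipeline below is a sketch against a made-up classpath string; the jar paths are illustrative, not taken from this project.)

```shell
# Hypothetical classpath as sbt might print it (paths are invented for illustration)
classpath='/x/scala-library-2.9.3.jar:/x/spark-core_2.9.3-0.8.1-incubating.jar:/x/akka-rabbitmq_2.9.2-0.0.2.jar'

# Split on ':' and list every distinct Scala cross-version suffix pulled in
echo "$classpath" | tr ':' '\n' | grep -o '_2\.[0-9.]*' | sort -u
```

If a `_2.10` suffix (or a stray scala-library 2.10.x jar) shows up here, that jar is the likely reason the debugger resolves into 2.10.2 sources.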

Here is project/plugins.sbt:

resolvers += "sbt-plugins" at "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"

addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-RC2")

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

EDIT To reproduce this, it is necessary to do a local mvn install of one or two libraries that are not available in any public repository.

mvn org.apache.maven.plugins:maven-install-plugin:2.5.1:install-file  -Dfile=c:\shared\kafka-0.7.2-spark.jar  -DgroupId=org.apache.kafka -DartifactId=kafka -Dversion=0.7.2-spark  -Dpackaging=jar

In any case I had not anticipated that someone (om-nom-nom!) would attempt an exact reproduction - so other unrelated items such as mergeStrategy and assemblyKeys were also omitted.

A completely self-contained reproducible setup would be an improvement - I have been rather pressed for time here.

[screenshot: debugger stepping into the Scala 2.10.2 library sources]

0 Answers:

No answers yet.