How to reference the jar files after sbt publish-local

Asked: 2014-06-27 00:19:52

Tags: scala sbt apache-spark

The Spark jars were published successfully to the local repository:

sbt publish-local

Here is an excerpt for spark-core - things look healthy:

[info]  published spark-core_2.10 to C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT\spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\poms\spark-core_2.10.pom
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\jars\spark-core_2.10.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\srcs\spark-core_2.10-sources.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\docs\spark-core_2.10-javadoc.jar
[info]  published ivy to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\ivys\ivy.xml

In particular, here are the files in .m2:

C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT>dir

 Directory of C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT

06/26/2014  04:25 PM    <DIR>          .
06/26/2014  04:25 PM    <DIR>          ..
06/26/2014  04:25 PM         1,180,476 spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
06/26/2014  04:24 PM           808,815 spark-core_2.10-1.1.0-SNAPSHOT-sources.jar
06/26/2014  02:27 PM         5,781,917 spark-core_2.10-1.1.0-SNAPSHOT.jar
06/26/2014  05:03 PM            13,436 spark-core_2.10-1.1.0-SNAPSHOT.pom

The problem arises when trying to consume the jars from a client project.

Here is an excerpt from the client's build.sbt:

val sparkVersion = "1.1.0-SNAPSHOT"
..
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion  % "compile->default"  withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion  % "compile->default"  withSources()

..

resolvers  ++= Seq(
  "Apache repo" at "https://repository.apache.org/content/repositories/releases",
  "Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
  Resolver.mavenLocal
)
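
As an aside, Resolver.mavenLocal already points at the local Maven repository (~/.m2/repository by default), so the hand-built "Local Repo" URL above largely duplicates it, and sbt's default "local" Ivy resolver already searches ~/.ivy2/local, which is where publish-local writes. A minimal resolver block for this setup could therefore look like the sketch below (the Apache URL is taken from the excerpt above; nothing else is specific to this project):

resolvers ++= Seq(
  // searches ~/.m2/repository, the local Maven repository
  Resolver.mavenLocal,
  // released Apache artifacts (not needed for the locally published SNAPSHOT)
  "Apache repo" at "https://repository.apache.org/content/repositories/releases"
)

You can inspect the resolver chain sbt actually ends up with via show fullResolvers in the sbt shell.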

So we have:

  • a healthy local repo
  • a build.sbt that references the local repo

But when we do:

sbt package

the dependencies on the very same Spark artifacts we just published fail to resolve:

[info] Loading project definition from C:\apps\hspark\project
[info] Set current project to hspark (in build file:/C:/apps/hspark/)
[info] Updating {file:/C:/apps/hspark/}hspark...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
  [info] Resolving org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
  [info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
  [info] Resolving org.scala-lang#jline;2.10.4 ...
  [info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  :: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
        at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
..
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[error] unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile


UPDATE Per the answer from @lpiepiora, removing compile->default does (surprisingly) make a difference. Here is the evidence so far.

(Using the dependency-graph plugin):

[info] Done updating.
[info] default:hspark_2.10:0.1.0-SNAPSHOT [S]
[info]   +-org.apache.spark:spark-core_2.10:1.1.0-SNAPSHOT [S]
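
For completeness, the tree above comes from the sbt-dependency-graph plugin. A typical setup sketch for sbt 0.13.x follows; the version number is an assumption, so substitute whichever release matches your sbt, and 0.7.x-era releases also expect the plugin's settings (net.virtualvoid.sbt.graph.Plugin.graphSettings) to be added to the build:

// project/plugins.sbt (version is an assumption)
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

After a reload, dependencyTree (dependency-tree in older shells) prints a resolved tree like the one shown above.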

1 Answer:

Answer 0: (score: 3)

Try removing the mapping compile->default from your dependencies. The ivy.xml that publish-local writes to ~/.ivy2/local does not declare a 'default' configuration, which is exactly what the "configuration not found" error is complaining about, and the mapping is redundant anyway, as the documentation says:

A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a different configuration than those.

So declare your dependencies as follows:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()
)

and they should resolve.
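
To make the quoted rule concrete: an explicit mapping is only worth writing when you need a configuration other than the dependency's default/compile one. A small illustrative sketch (the com.example coordinate is made up purely for illustration):

libraryDependencies ++= Seq(
  // no mapping: your compile configuration uses the library's default/compile configuration
  "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
  // explicit mapping: pull a (hypothetical) library's test configuration into your own tests
  "com.example" %% "example-lib" % "1.0" % "test->test"
)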