Unresolved dependencies error when building a jar in IntelliJ

Date: 2018-07-20 18:05:53

Tags: scala apache-spark intellij-idea

I am working on a Spark JDBC program. For that, I added the following dependencies to my build.sbt file:

name := "YearPartition"

version := "0.1"

scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
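As an aside, the two dependency lines can be consolidated so the Spark version lives in one place; a minimal equivalent sketch (same artifacts, same versions):

```scala
// build.sbt (equivalent sketch): factor the Spark version into one value.
// %% appends the Scala binary version to the artifact name, so under
// scalaVersion 2.11.8 these resolve to spark-core_2.11 and spark-sql_2.11,
// matching the artifact names that appear in the sbt log.
val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion
)
```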

When I click "Build Jar" in IntelliJ, it reports unresolved dependencies. I am trying to run this on my home network rather than the corporate network.

"C:\Program Files\Java\jdk1.8.0_172\bin\java.exe" -Xms512M -Xmx1024M -Xss1M -XX:+CMSClassUnloadingEnabled -Dhttp.proxyHost=cis-india-pitc-bangalorez.proxy.corporate.ge.com -Dhttp.proxyPort=80 -Dhttps.proxyHost=cis-india-pitc-bangalorez.proxy.corporate.ge.com -Dhttps.proxyPort=80 "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2018.1.4\lib\idea_rt.jar=62508:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2018.1.4\bin" -Dfile.encoding=windows-1252 -classpath C:\Users\212668640\.IdeaIC2018.1\config\plugins\Scala\launcher\sbt-launch.jar xsbt.boot.Boot clean compile package
[info] Loading settings from idea.sbt ...
[info] Loading global plugins from C:\Users\212668640\.sbt\1.0\plugins
[info] Loading project definition from C:\YearPartition\project
[info] Loading settings from build.sbt ...
[info] Set current project to YearPartition (in build file:/C:/YearPartition/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Jul 20, 2018 11:26:26 PM
[info] Updating ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-core_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/2.2.0/spark-core_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[warn]  :: org.apache.spark#spark-sql_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-sql_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-sql_2.11/2.2.0/spark-sql_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn] 
[warn]  Note: Unresolved dependencies path:
[warn]      org.apache.spark:spark-core_2.11:2.2.0 (C:\YearPartition\build.sbt#L8-9)
[warn]        +- default:yearpartition_2.11:0.1
[warn]      org.apache.spark:spark-sql_2.11:2.2.0 (C:\YearPartition\build.sbt#L11-12)
[warn]        +- default:yearpartition_2.11:0.1
[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-core_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/2.2.0/spark-core_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[error] unresolved dependency: org.apache.spark#spark-sql_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-sql_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-sql_2.11/2.2.0/spark-sql_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[error]     at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:334)
[error]     at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error]     at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error]     at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error]     at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error]     at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error]     at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error]     at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error]     at xsbt.boot.Using$.withResource(Using.scala:10)
[error]     at xsbt.boot.Using$.apply(Using.scala:9)
[error]     at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error]     at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error]     at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error]     at xsbt.boot.Locks$.apply(Locks.scala:28)
[error]     at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error]     at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error]     at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error]     at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error]     at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error]     at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error]     at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error]     at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error]     at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error]     at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2398)
[error]     at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error]     at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error]     at sbt.std.Transform$$anon$4.work(System.scala:66)
[error]     at sbt.Execute.$anonfun$submit$2(Execute.scala:263)
[error]     at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error]     at sbt.Execute.work(Execute.scala:272)
[error]     at sbt.Execute.$anonfun$submit$1(Execute.scala:263)
[error]     at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error]     at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error]     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-core_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/2.2.0/spark-core_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[error] unresolved dependency: org.apache.spark#spark-sql_2.11;2.2.0: public: unable to get resource for org/apache/spark#spark-sql_2.11;2.2.0: res=https://repo1.maven.org/maven2/org/apache/spark/spark-sql_2.11/2.2.0/spark-sql_2.11-2.2.0.pom: java.net.UnknownHostException: cis-india-pitc-bangalorez.proxy.corporate.ge.com
[error] Total time: 2 s, completed Jul 20, 2018 11:26:27 PM

Process finished with exit code 1

The Scala (2.11.8) and Spark (2.2.0) versions match the ones used in our project, which is why I created this project with them. Can anyone tell me how to resolve this?
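For what it's worth, the `java.net.UnknownHostException` in the log names the corporate proxy host, so sbt is apparently still being launched with the corporate proxy flags even on the home network. A sketch of a likely fix, assuming the flags come from IntelliJ's Settings > Build Tools > sbt > VM parameters (the usual place IDEA injects them):

```
# What the launch command currently passes (only reachable on the corporate network):
-Dhttp.proxyHost=cis-india-pitc-bangalorez.proxy.corporate.ge.com -Dhttp.proxyPort=80
-Dhttps.proxyHost=cis-india-pitc-bangalorez.proxy.corporate.ge.com -Dhttps.proxyPort=80

# On the home network: delete the four proxy flags above, or keep them and
# exclude Maven Central from proxying (http.nonProxyHosts also covers HTTPS):
-Dhttp.nonProxyHosts=repo1.maven.org
```

After changing the VM parameters, rerunning `clean compile package` should let sbt reach repo1.maven.org directly.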

0 Answers:

No answers