Does a jar built with sbt include its dependencies?

Date: 2017-12-11 18:10:08

Tags: scala apache-spark sbt

I have to build a jar from my Scala code using sbt:

sudo sbt package

It uses my build file:

name := "PSG CCD"
version := "1.0"
scalaVersion := "2.11.8"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "2.2.0",
        "org.apache.spark" %% "spark-sql" % "2.2.0",
        "org.apache.spark" %% "spark-streaming" % "2.2.0",
        "neo4j-contrib" % "neo4j-spark-connector" % "2.0.0-M2"
)

The jar builds just fine. I then transfer it to my Spark server over FTP and run spark-submit:

spark-submit --class "PSGApp" --master local[4] psg_ccd.jar

I get this error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/neo4j/spark/Neo4j
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:739)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.neo4j.spark.Neo4j
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 10 more

So even though I have neo4j-spark-connector in my dependencies file, Spark doesn't seem to find it when I run the jar. Now, Spark runs on a different machine from the one the Scala build happens on. Does that matter? Is there a lib folder I need to copy over and drop somewhere?

My guess is that the library code for the neo4j-spark-connector dependency is not being built into the jar I'm trying to run.

Maybe I'm missing a switch to force that?
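
(For reference, spark-submit does have standard flags for supplying dependencies at submit time: --jars ships extra jar files alongside the application, and --packages resolves Maven coordinates when the job is submitted. A sketch using the coordinates and repository from the build file above; the accepted fix below bundles the dependency into the jar instead:)

spark-submit --class "PSGApp" --master local[4] \
  --repositories http://dl.bintray.com/spark-packages/maven \
  --packages neo4j-contrib:neo4j-spark-connector:2.0.0-M2 \
  psg_ccd.jar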

1 Answer:

Answer 0 (score: 5):

You need to use the sbt-assembly plugin to generate a fat jar that bundles your dependencies. There are plenty of examples of this on Stack Overflow; here is one: How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?
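
In sketch form (the plugin version and jar name here are illustrative assumptions, not from the question): add sbt-assembly to project/plugins.sbt, mark the Spark artifacts as "provided" so the fat jar only bundles what the cluster does not already ship, and build with sbt assembly instead of sbt package.

// project/plugins.sbt -- pulls in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt changes -- Spark itself is already on the cluster,
// so exclude it from the fat jar; the neo4j connector stays in:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided",
  "neo4j-contrib" % "neo4j-spark-connector" % "2.0.0-M2"
)

// Name the output jar to match the spark-submit command in the question
assemblyJarName in assembly := "psg_ccd.jar"

// Duplicate META-INF entries from transitive dependencies are a common
// assembly failure; a typical (if blunt) merge strategy:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}

Running sbt assembly then produces the fat jar under target/scala-2.11/, and the same spark-submit command works because org.neo4j.spark.Neo4j is now inside the jar. Note that with Spark marked "provided", the app will only run via spark-submit, not via sbt run.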