sbt assembly jar

Date: 2018-06-05 13:10:24

Tags: scala jar sbt sbt-assembly

I have an application with a KafkaProducer and a SparkConsumer object, each with its own main class. For testing purposes I run them in parallel in separate terminal windows. However, I want to deploy the application as a single artifact, which is why I created a jar with dependencies using sbt-assembly. When I run the packaged application as the CMD in a Dockerfile, I get the following error message:

Error: Could not find or load main class consumer.SparkConsumer

Here is the bash script that starts my application:

#!/usr/bin/env bash
if [ "$1" = "consumer" ]
then
    java -cp "target/scala-2.11/demo_2.11.jar" consumer.SparkConsumer $2 $3 $4
elif [ "$1" = "producer" ]
then
    java -cp "target/scala-2.11/demo_2.11.jar" producer.KafkaProducer $5 $3 $6 $7
else
    echo "Wrong parameter. It should be consumer or producer"
fi
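
Note that the script launches each class with `java -cp` and an explicit fully qualified class name, so the jar manifest's `Main-Class` entry is not consulted; the error above most likely means the class is not actually inside the assembled jar, or the jar path is different inside the container. If a single entry point per jar were acceptable, the standard sbt-assembly `mainClass` key could pin it so that `java -jar` also works. A minimal sketch, reusing the class name from the question:

```scala
// Hypothetical addition to build.sbt: pin one entry point into the assembly
// manifest so `java -jar target/scala-2.11/demo_2.11.jar` works without
// naming the class. With `java -cp <jar> consumer.SparkConsumer` this
// setting is ignored.
mainClass in assembly := Some("consumer.SparkConsumer")
```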

How can I declare two main classes in build.sbt, either to package separate jars or to provide a choice of main class within a single jar? (I'm not sure which approach is better.)

This is what my build.sbt file looks like:

version := "0.1"

scalaVersion := "2.11.8"

assemblyJarName in assembly := "demo_2.11.jar"

val sparkVersion = "2.2.0"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.5"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.5"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
  "neo4j-contrib" % "neo4j-spark-connector" % "2.1.0-M4",
  "com.typesafe" % "config" % "1.3.0",
  "org.neo4j.driver" % "neo4j-java-driver" % "1.5.1",
  "com.opencsv" % "opencsv" % "4.1",
  "com.databricks" %% "spark-csv" % "1.5.0",
  "com.github.tototoshi" %% "scala-csv" % "1.3.5",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "6.2.4"
)

assemblyMergeStrategy in assembly := {
  case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case PathList("org", "slf4j", xs @ _*) => MergeStrategy.last
  case PathList("org", "neo4j", xs @ _*) => MergeStrategy.last
  case PathList("com", "typesafe", xs @ _*) => MergeStrategy.last
  case PathList("net", "jpountz", xs @ _*) => MergeStrategy.last
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
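
One possible direction for the separate-jar approach (a sketch under assumptions, not a tested answer): split the two entry points into sbt subprojects, so each one gets its own assembly jar with its own main class. The subproject directories and jar names below are illustrative; only the main class names come from the question.

```scala
// Sketch: multi-project build.sbt producing one assembly jar per entry point.
// Directory names ("consumer", "producer") and jar names are assumptions.
lazy val consumer = (project in file("consumer")).settings(
  scalaVersion := "2.11.8",
  mainClass in assembly := Some("consumer.SparkConsumer"),
  assemblyJarName in assembly := "consumer.jar"
)

lazy val producer = (project in file("producer")).settings(
  scalaVersion := "2.11.8",
  mainClass in assembly := Some("producer.KafkaProducer"),
  assemblyJarName in assembly := "producer.jar"
)
```

Each jar would then be built with `sbt consumer/assembly` and `sbt producer/assembly`, and the bash script could call `java -jar` on the matching jar instead of `java -cp` with a class name.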

0 Answers

No answers yet.