How do I specify the Launcher with the Spring Boot Gradle plugin?

Asked: 2016-03-23 01:34:06

Tags: gradle spring-boot launcher

Spring Boot ships three launchers: JarLauncher, PropertiesLauncher, and WarLauncher. For an executable jar, JarLauncher is used by default. I want to use PropertiesLauncher instead, so that I can use an external classpath. How can I specify that with the Spring Boot Gradle plugin?

According to section D.3.1, Launcher manifest, of the documentation, I can specify the main class in MANIFEST.MF like this:

Main-Class: org.springframework.boot.loader.JarLauncher
Start-Class: com.mycompany.project.MyApplication
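For reference, a plain Gradle `jar` task can set these two attributes directly (a minimal sketch using the class names from the manifest example above; note that the Boot plugin's repackage step may overwrite a hand-written manifest, which is part of what this question is about):

```groovy
jar {
    manifest {
        attributes(
            // Launcher that resolves an external classpath at startup
            'Main-Class': 'org.springframework.boot.loader.PropertiesLauncher',
            // The application's own main class
            'Start-Class': 'com.mycompany.project.MyApplication'
        )
    }
}
```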

However, with the Spring Boot Gradle plugin, the following snippet actually sets Start-Class, not Main-Class:

springBoot {
    mainClass = "com.sybercare.HealthServiceApplication"
}

Do I need to create the MANIFEST.MF manually, or is there a better way to do this?

Thanks!

2 answers:

Answer 0 (score: 6)

Add the layout property:

springBoot {
    mainClass = "com.sybercare.HealthServiceApplication"
    layout = "ZIP"
}

Setting layout = "ZIP" makes Spring Boot repackage the archive with PropertiesLauncher as its Main-Class.
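With that layout in place, the external classpath is supplied at launch time through PropertiesLauncher's loader.path system property (the jar path and directory names below are assumptions for illustration):

```shell
# lib/ and config/ are added to the classpath resolved by PropertiesLauncher
java -Dloader.path=lib/,config/ -jar build/libs/myapp.jar
```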

Answer 1 (score: 1)

The layout property defaults to a guess based on the archive type (jar or war). For PropertiesLauncher the layout is ZIP (even though the output may be a jar file).

https://docs.spring.io/autorepo/docs/spring-boot/1.2.0.M2/maven-plugin/usage.html
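To confirm which launcher the plugin actually wrote into the repackaged archive, you can print its manifest (the jar path is an assumption; adjust it to your build output):

```shell
# Shows Main-Class and Start-Class of the built jar
unzip -p build/libs/myapp.jar META-INF/MANIFEST.MF
```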