I am trying to deploy the simplest possible application using the Spark Java framework on CloudBees. The build produces a jar file that I tried to deploy through a Jenkins push->deploy, but it warned me that the deployment plugin cannot deploy jar files...
In any case, I deployed my jar through the CloudBees SDK and its CLI:
bees app:deploy -t java -R java_version=1.7 target\myapp-with-dependencies.jar
It then tells me that the application has been deployed to my URL. However, when I try to access this URL, I get a 502 Bad Gateway error...
On the other hand, running my main class either from IntelliJ or from the Maven-generated jar file returns the expected "Hello Spark" at 127.0.0.1:8080.
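For reference, the local run amounts to roughly the following (the jar name here is taken from the deploy command above; the actual shaded artifact name may differ):
mvn package
java -jar target\myapp-with-dependencies.jar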
Here is my main class:
import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class HelloSpark {
    public static void main(String[] args) {
        // Use the port CloudBees passes in via the app.port system property,
        // falling back to 8080 for local runs.
        String port = System.getProperty("app.port", "8080");
        Spark.setPort(Integer.parseInt(port));

        Spark.get(new Route("/") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello Spark";
            }
        });
    }
}
And here is my pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>spark-from-scratch</groupId>
    <artifactId>spark-from-scratch</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>spark</groupId>
            <artifactId>spark</artifactId>
            <version>0.9.9.4-SNAPSHOT</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>HelloSpark</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
Answer 0 (score: 1)
I think you need to specify your main class in the deploy command, like this:
bees app:deploy -t java -R class=your.main.Class -R java_version=1.7 PATH_TO_APP_PACKAGE
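Applied to the class and jar from the question, that would presumably look something like the following (assuming HelloSpark sits in the default package, as in the posted code):
bees app:deploy -t java -R class=HelloSpark -R java_version=1.7 target\myapp-with-dependencies.jar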
Answer 1 (score: 1)
I think it is necessary to specify it in your command; you can read more about it here: