How do I create a WAR for a Spark 1.1.1 application?

Asked: 2016-02-29 02:56:20

Tags: java tomcat jar war spark-java

I have built a working application with Spark. Now I want to deploy it to Apache Tomcat 7. I tried to build the WAR with Apache Ant:

<target name="war">
  <war destfile="ROOT.war"
       webxml="web.xml">
     <classes dir="classes"/>
  </war>
</target>
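
For the Spark filter to be loadable by Tomcat, the spark-core JAR and its transitive dependencies have to end up inside the WAR under WEB-INF/lib. Below is a minimal sketch of how the war task could bundle them; the `lib` directory name is my assumption (any folder holding the collected JARs works), not something from the original build file:

<target name="war">
  <war destfile="ROOT.war" webxml="web.xml">
    <classes dir="classes"/>
    <!-- Assumption: 'lib' contains spark-core-1.1.1.jar and its transitive
         dependencies; the nested <lib> element copies them into WEB-INF/lib. -->
    <lib dir="lib">
      <include name="*.jar"/>
    </lib>
  </war>
</target>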

I put the compiled classes into the classes folder. In web.xml I used the officially recommended configuration:

<web-app>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.MyApplication</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>

Then I copied ROOT.war into the webapps folder and restarted Tomcat, but my application was not deployed.

I checked catalina.out:

java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[]]
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:904)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1073)
    at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1857)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

That is not very helpful. Then I checked the localhost log:

Feb 28, 2016 4:02:24 AM org.apache.catalina.core.StandardContext filterStart
SEVERE: Exception starting filter SparkFilter
java.lang.ClassNotFoundException: spark.servlet.SparkFilter
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1718)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1569)
    at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:529)
    at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:511)
    at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:139)
    at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:258)
    at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
    at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4809)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5485)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:1073)
    at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1857)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

So it looks like some Spark JAR is missing from the classpath. I added /usr/share/tomcat7/lib/spark-core-1.1.1.jar.

It looks like spark.servlet.SparkFilter is not in that JAR. Any idea where I can find this JAR? Or, if anyone has a ready-to-deploy WAR file, could you share it with me?

I have already confirmed that JARs placed in /usr/share/tomcat7/lib/ are picked up by other servlets.
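
For reference, spark.servlet.SparkFilter ships in the SparkJava framework's own spark-core artifact (groupId com.sparkjava), which is easy to confuse with Apache Spark's similarly named spark-core JAR. A hedged sketch of the Maven coordinates I believe correspond to version 1.1.1 (worth verifying against Maven Central):

<!-- Assumed coordinates for the SparkJava web framework, not Apache Spark. -->
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>1.1.1</version>
</dependency>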

1 answer:

Answer 0 (score: 3)

I have put together an example of how to build a WAR with Maven and SparkJava 2.3.

Here is the repository with instructions: https://github.com/kliakos/sparkjava-war-example

The steps are fairly simple. It should work with a fresh Tomcat installation; otherwise it is probably some Tomcat-related issue on your side.
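
To illustrate the Maven route this answer describes, here is a minimal pom.xml sketch for packaging a SparkJava 2.3 application as a WAR named ROOT.war. The project coordinates are placeholders, and the linked repository remains the authoritative example:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <!-- Placeholder coordinates for your own application. -->
    <groupId>com.example</groupId>
    <artifactId>spark-war-demo</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>war</packaging>

    <dependencies>
        <!-- SparkJava core; spark.servlet.SparkFilter lives in this artifact
             and is pulled into WEB-INF/lib together with its dependencies. -->
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.3</version>
        </dependency>
    </dependencies>

    <build>
        <!-- Produce ROOT.war so it deploys at the context root, as in the question. -->
        <finalName>ROOT</finalName>
        <plugins>
            <!-- web.xml is expected at src/main/webapp/WEB-INF/web.xml. -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <version>2.6</version>
            </plugin>
        </plugins>
    </build>
</project>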