NoClassDefFoundError when trying to run a Java Spark application from the command line

Date: 2019-10-17 16:10:41

Tags: java apache-spark classnotfoundexception noclassdeffounderror deploying

I'm trying to run a Java application that uses Spark, but I keep hitting a NoClassDefFoundError whenever I run mvn package; mvn exec:java.

The same error (below) appears whether I run the program from PowerShell or from IntelliJ. When I remove a handful of Maven dependencies, the error goes away and the servlet runs on localhost.

Error message:

Exception in thread "Thread-0" java.lang.NoClassDefFoundError: javax/servlet/http/HttpSessionIdListener
    at org.eclipse.jetty.server.session.SessionHandler.<clinit>(SessionHandler.java:140)
    at spark.embeddedserver.jetty.EmbeddedJettyFactory.create(EmbeddedJettyFactory.java:43)
    at spark.embeddedserver.EmbeddedServers.create(EmbeddedServers.java:65)
    at spark.Service.lambda$init$2(Service.java:497)
    at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: java.lang.ClassNotFoundException: javax.servlet.http.HttpSessionIdListener
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 5 more
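
The missing class, javax.servlet.http.HttpSessionIdListener, was only introduced in Servlet 3.1, so the trace suggests an older Servlet API jar is winning Maven's nearest-wins resolution over the 3.1 jar that Jetty needs. A minimal sketch of one possible fix (the version below is an assumption, not from the original post) is to declare the 3.1 API directly, since a direct declaration outranks any transitive version:

        <!-- Assumed fix: pin the Servlet 3.1 API, which contains
             HttpSessionIdListener, so it wins dependency resolution. -->
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <version>3.1.0</version>
        </dependency>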
The dependencies I removed from pom.xml:

        <dependency>
            <groupId>com.googlecode.json-simple</groupId>
            <artifactId>json-simple</artifactId>
            <version>1.1</version>
        </dependency>

        <dependency>
            <groupId>edu.stanford.nlp</groupId>
            <artifactId>stanford-corenlp</artifactId>
            <version>3.9.2</version>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.13</version>
        </dependency>

        <dependency>
            <groupId>edu.stanford.nlp</groupId>
            <artifactId>stanford-corenlp</artifactId>
            <version>3.9.2</version>
            <classifier>models</classifier>
        </dependency>

The servlet runs fine with only the Spark dependency:

        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.6.0</version>
        </dependency>

However, as soon as I add anything more, the exception comes back.
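
That pattern points at a classpath conflict: one of the re-added artifacts most likely drags in a pre-3.1 javax.servlet-api transitively, and Maven resolves it in place of the one Jetty expects. Running mvn dependency:tree -Dincludes=javax.servlet shows which artifact is responsible; if it turned out to be stanford-corenlp (an assumption here, not confirmed in the post), an exclusion would keep the stale jar off the classpath:

        <!-- Hypothetical exclusion, assuming dependency:tree shows
             stanford-corenlp pulling in an older Servlet API. -->
        <dependency>
            <groupId>edu.stanford.nlp</groupId>
            <artifactId>stanford-corenlp</artifactId>
            <version>3.9.2</version>
            <exclusions>
                <exclusion>
                    <groupId>javax.servlet</groupId>
                    <artifactId>javax.servlet-api</artifactId>
                </exclusion>
            </exclusions>
        </dependency>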

1 Answer:

Answer 0 (score: 0)

Solved! The sparkjava dependency needs more than five companion dependencies, which can be found here: https://mvnrepository.com/artifact/com.sparkjava/spark-core/2.9.1
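
A hedged reading of that answer: the linked page is for spark-core 2.9.1, whose pom already declares the Jetty, WebSocket, and SLF4J artifacts it needs, so bumping the version from 2.6.0 (a sketch, assuming 2.9.1 is otherwise compatible with the project) may be enough to pull a matching Servlet API in transitively:

        <!-- Sketch: spark-core 2.9.x declares its Jetty/Servlet
             dependencies itself, so they arrive transitively. -->
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.9.1</version>
        </dependency>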