NoClassDefFoundError when creating a JavaSparkContext

Posted: 2016-01-30 17:58:15

Tags: java apache-spark spring-boot

I have a Spring Boot application to which I have added the Spark Core dependency, because I want to use a JavaSparkContext in it.
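
For reference, the JavaSparkContext bean in SparkConfig (the class that appears in the stack trace below) is created by a factory method roughly like this. This is only a minimal sketch; the master URL and application name shown here are placeholders, not the exact values from the project:

package org.vferrer.sparkker.config;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SparkConfig {

    @Bean
    public JavaSparkContext javaSparkContext() {
        // Hypothetical settings: app name and local master are placeholders
        SparkConf conf = new SparkConf()
                .setAppName("sparkker")
                .setMaster("local[*]");
        // Constructing the JavaSparkContext is what triggers the error below
        return new JavaSparkContext(conf);
    }
}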

Unfortunately, when I try to initialize the application, I get this NoClassDefFoundError:

Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.spark.api.java.JavaSparkContext]: Factory method 'javaSparkContext' threw exception; nested exception is java.lang.NoClassDefFoundError: com/sun/jersey/spi/container/servlet/ServletContainer
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:189) ~[spring-beans-4.2.3.RELEASE.jar:4.2.3.RELEASE]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:588) ~[spring-beans-4.2.3.RELEASE.jar:4.2.3.RELEASE]
... 44 common frames omitted
Caused by: java.lang.NoClassDefFoundError: com/sun/jersey/spi/container/servlet/ServletContainer
at org.apache.spark.status.api.v1.ApiRootResource$.getServletHandler(ApiRootResource.scala:187) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:70) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:76) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:195) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:146) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.SparkContext.<init>(SparkContext.scala:473) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59) ~[spark-core_2.10-1.6.0.jar:1.6.0]
at org.vferrer.sparkker.config.SparkConfig.javaSparkContext(SparkConfig.java:28) ~[classes/:na]
at org.vferrer.sparkker.config.SparkConfig$$EnhancerBySpringCGLIB$$b19a91e3.CGLIB$javaSparkContext$0(<generated>) ~[classes/:na]
at org.vferrer.sparkker.config.SparkConfig$$EnhancerBySpringCGLIB$$b19a91e3$$FastClassBySpringCGLIB$$a68aab86.invoke(<generated>) ~[classes/:na]
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228) ~[spring-core-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:318) ~[spring-context-4.2.3.RELEASE.jar:4.2.3.RELEASE]
at org.vferrer.sparkker.config.SparkConfig$$EnhancerBySpringCGLIB$$b19a91e3.javaSparkContext(<generated>) ~[classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_51]

Here is the relevant part of my pom.xml:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.3.1.RELEASE</version>
    <relativePath /> <!-- lookup parent from repository -->
</parent>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-actuator</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-eureka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-feign</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-hateoas</artifactId>
    </dependency>

    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.0</version>
        <exclusions>
            <!-- Introduced to fix a problem with log4j and slf4j -->
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-parent</artifactId>
            <version>Brixton.M3</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
</build>

I have already tried manually adding the missing Jersey jars (core and server), to no avail. Any ideas? Thanks!

UPDATE: As @Davide suggested, I added this new Jersey server dependency:

<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-server</artifactId>
    <version>1.2</version>
</dependency>

Unfortunately, that brings even more errors along the way:

INFO 3300 --- [           main] o.s.c.n.eureka.InstanceInfoFactory       : Setting initial instance status as: STARTING
INFO 3300 --- [           main] c.n.d.provider.DiscoveryJerseyProvider   : Using encoding codec LegacyJacksonJson
INFO 3300 --- [           main] c.n.d.provider.DiscoveryJerseyProvider   : Using decoding codec LegacyJacksonJson
ERROR 3300 --- [           main] com.sun.jersey.spi.inject.Errors         : The following errors and warnings have been detected with resource and/or provider classes:
SEVERE: Missing dependency for field: javax.ws.rs.core.UriInfo com.sun.jersey.server.impl.template.ViewableMessageBodyWriter.ui
SEVERE: Missing dependency for field: com.sun.jersey.spi.template.TemplateContext com.sun.jersey.server.impl.template.ViewableMessageBodyWriter.tc

5 Answers:

Answer 0 (score: 2):

It seems that you need to add a dependency for the Jersey server:

<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-server</artifactId>
    <version>1.2</version>
</dependency>

Answer 1 (score: 1):

Use a pre-built Spark jar, which already includes all the dependencies.

You can find it here.

Answer 2 (score: 0):

According to the answer at stackoverflow.com/questions/18086218, you need to use Jersey 1, which means replacing @Davide Lorenzo MARINO's dependency with something like:

<dependency>
     <groupId>com.sun.jersey</groupId>
     <artifactId>jersey-servlet</artifactId>
     <version>1.19</version>
</dependency>

or use version 1.17.1 as described in that answer.

You may need this jersey repository for more of the dependencies in the list.

Answer 3 (score: 0):

In the end, I got it running by downgrading to Spark 1.3.1. I tried all the suggestions given here, to no avail.

    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.1</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

Still, I really don't like this solution.

Answer 4 (score: 0):

In my case, I had a provided scope on one of the module's Maven dependencies.

When trying to run the module locally, the provided scope tells Maven not to load those classes from the project's classpath, but to rely on a version provided at runtime.

So commenting out the provided scope solved the problem.