I am new to Apache Spark. I am trying to execute a simple Spring Boot application with Spark, but I am getting the exception below.
ERROR ApplicationMaster: User class threw exception:
java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication
However, I am able to execute this project completely from the Eclipse IDE; it runs and prints the output statement I kept in the main class. My pom.xml is:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.3.3.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency> <!-- Spark -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Spark SQL -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Jackson Scala module -->
        <groupId>com.fasterxml.jackson.module</groupId>
        <artifactId>jackson-module-scala_2.10</artifactId>
        <version>2.6.5</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
My main Spring class is:
@SpringBootApplication
public class SparkS3Application {

    public static void main(String[] args) {
        SpringApplication.run(SparkS3Application.class, args);
        System.out.println(" *************************** called *******************");
    }
}
Answer 0 (score: 3)
I added the required dependencies simply by passing --jars "jar-path,another-jar-path" to my spark-submit command. You need to supply all the required jars, separated by commas, after --jars.
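For example, a spark-submit invocation might look like the sketch below. The master, main-class name, and jar paths are illustrative placeholders, not values from the original post:

# Illustrative only: class name and jar locations are placeholders.
spark-submit \
    --class com.example.SparkS3Application \
    --master yarn \
    --jars /path/to/spring-boot-1.3.3.RELEASE.jar,/path/to/spring-core-4.2.5.RELEASE.jar \
    /path/to/spark-s3-application.jar

Jars listed after --jars are shipped with the application and placed on the driver and executor classpaths, which is what resolves the missing org.springframework.boot.SpringApplication class at runtime.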
The second thing is to try doing this on Spark 2.0. I was using Spark 1.6 and faced this problem, but it works perfectly fine with Spark 2.0.
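If you upgrade, the Spark dependencies in the pom would change along these lines (a sketch assuming the Scala 2.11 builds of Spark 2.0.0):

<dependency> <!-- Spark core, built against Scala 2.11 -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency> <!-- Spark SQL -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
</dependency>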
Hope this helps you all.