I am using Spark 1.6.0 and I am trying to write a very simple word-count project. I am getting this error:
java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
Here is my code:
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import java.util.Arrays;
import org.apache.spark.SparkConf;

public class WordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.textFile("scrittura.txt");

        // Total character count across all lines
        JavaRDD<Integer> lineLengths = lines.map(s -> s.length());
        int totalLength = lineLengths.reduce((a, b) -> a + b);
        System.out.println("TOTAL: " + totalLength);

        // Split each line into words, keeping only letters and spaces
        JavaRDD<String> flat = lines
                .flatMap(x -> Arrays.asList(x.replaceAll("[^A-Za-z ]", "").split(" ")));
        JavaPairRDD<String, Integer> map = flat
                .mapToPair(x -> new Tuple2<String, Integer>(x, 1));
        JavaPairRDD<String, Integer> reduce = map
                .reduceByKey((x, y) -> x + y);
        System.out.println(reduce.collect());

        sc.stop();
        sc.close();
    }
}
Here is my log:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:110)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:101)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:466)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at WordCount.main(WordCount.java:16)
Caused by: java.lang.ClassNotFoundException: javax.servlet.FilterRegistration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 18 more
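Since the root cause is a `ClassNotFoundException` for `javax.servlet.FilterRegistration`, one way to see what the JVM actually finds is a stdlib-only probe run with the same classpath as the job (`ServletApiCheck` is a hypothetical name, not part of the project):

```java
public class ServletApiCheck {
    // Returns true when the Servlet 3.0 API is visible on this JVM's classpath.
    // javax.servlet.FilterRegistration only exists from the 3.0 API onward,
    // so an older servlet-api 2.5 jar on the classpath makes this return false.
    static boolean servletApiPresent() {
        try {
            Class.forName("javax.servlet.FilterRegistration");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(servletApiPresent()
                ? "servlet 3.0 API present"
                : "servlet 3.0 API missing");
    }
}
```

If this prints "missing" under the same classpath the application uses, the problem is the dependency set rather than the application code.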
Here is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <artifactId>examples</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>examples</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.2</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.jetty.orbit</groupId>
            <artifactId>javax.servlet</artifactId>
            <version>3.0.0.v201112011016</version>
        </dependency>
    </dependencies>
</project>
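For context: `javax.servlet.FilterRegistration` was introduced in the Servlet 3.0 API, so this error usually means an older servlet-api jar (commonly the 2.5 one pulled in transitively by `hadoop-client`) is shadowing the 3.0 classes. A sketch of one commonly suggested adjustment to the pom, not a confirmed fix for this exact build (the exclusion and the `javax.servlet:javax.servlet-api:3.0.1` artifact are assumptions to verify with `mvn dependency:tree -Dincludes=javax.servlet`):

```xml
<!-- Assumption: hadoop-client transitively pulls the Servlet 2.5 API; exclude it -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.2</version>
    <exclusions>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- Servlet 3.0 API, which defines javax.servlet.FilterRegistration -->
<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.0.1</version>
</dependency>
```

It may also be worth aligning the Spark artifacts on one version: the question says Spark 1.6.0, but the pom declares spark-core 1.5.1 and spark-mllib 1.5.0.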
How can I fix this?
Thanks!