Here is my code.
The exception is thrown right at the first line of the code; the error message appears while debugging.
Am I missing something?
SparkConf sparkConf = new SparkConf().setAppName("JavaKafkaWordCount").setMaster("local[*]");
// Create the context with a 1 second batch interval
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(1000));
Map<String, Integer> topicMap = new HashMap<>();
topicMap.put("ParsedDataQueue", 1);
JavaPairReceiverInputDStream<String, String> messages = KafkaUtils.createStream(jssc, "9.32.165.247:2181", "5",
topicMap);
I hit the error right at the first line itself!
16/11/15 06:10:04 INFO BlockManagerMaster: Registered BlockManager
16/11/15 06:10:05 ERROR MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantiated
16/11/15 06:10:05 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:88)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:57)
at java.lang.reflect.Constructor.newInstance(Constructor.java:437)
at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:192)
at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:186)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:186)
at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:100)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:540)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
at spark.ScoreConsumer.main(ScoreConsumer.java:39)
Caused by: java.lang.NoSuchMethodError: com/fasterxml/jackson/databind/module/SimpleSerializers.<init>(Ljava/util/List;)V (loaded from file:/C:/Users/Administrator/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.0.0/jackson-databind-2.0.0.jar by sun.misc.Launcher$AppClassLoader@86b3226c) called from class com.codahale.metrics.json.MetricsModule (loaded from file:/C:/Users/Administrator/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.2/metrics-json-3.1.2.jar by sun.misc.Launcher$AppClassLoader@86b3226c).
at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:223)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:49)
... 18 more
16/11/15 06:10:05 INFO SparkUI: Stopped Spark web UI at http://9.98.171.228:4040
16/11/15 06:10:05 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/11/15 06:10:05 INFO MemoryStore: MemoryStore cleared
16/11/15 06:10:05 INFO BlockManager: BlockManager stopped
16/11/15 06:10:05 INFO BlockManagerMaster: BlockManagerMaster stopped
16/11/15 06:10:05 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/11/15 06:10:05 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.reflect.InvocationTargetException
	... (same stack trace and Caused by as above)
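From the Caused by line it looks like a dependency conflict: metrics-json 3.1.2 is calling a SimpleSerializers(List) constructor that the jackson-databind 2.0.0 jar on my classpath does not have. To check which dependency is actually pulling in jackson-databind 2.0.0, I assume the Maven dependency tree can be filtered roughly like this (the includes filter is my guess at the plugin syntax):

mvn dependency:tree -Dincludes=com.fasterxml.jackson.core:jackson-databind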
These are the dependencies in the project:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-twitter_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.10.0.1</version>
</dependency>
<!--
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpmime</artifactId>
    <version>4.0-alpha3</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>com.liferay.portal</groupId>
    <artifactId>util-java</artifactId>
    <version>6.0.2</version>
</dependency>
<dependency>
    <groupId>commons-httpclient</groupId>
    <artifactId>commons-httpclient</artifactId>
    <version>3.1</version>
</dependency>
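If the old jackson-databind really is the culprit, I assume explicitly declaring a newer version in the pom would force Maven to resolve that version everywhere. A minimal sketch, assuming 2.4.4 is the Jackson line that Spark 1.6.x is built against:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <!-- 2.4.4 is an assumption; any 2.4+ release compatible with Spark 1.6.x should do -->
    <version>2.4.4</version>
</dependency>

Is that the right way to resolve the conflict, or am I missing something else?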