Spark Streaming - JavaNetworkWordCount example error

Date: 2018-03-02 08:53:31

Tags: spark-streaming

I am new to Spark Streaming. I am running the basic JavaNetworkWordCount example from the folder where I downloaded Spark, using the following commands:

1) nc -l -p 9999

2) bin/run-example org.apache.spark.examples.streaming.JavaNetworkWordCount localhost 9999

I am using spark-2.3.0-bin-hadoop2.7.
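
For context, the core of the bundled example looks roughly like this (paraphrased from the Spark 2.3 examples source and lightly abridged, so treat it as a sketch rather than the exact file):

    import java.util.Arrays;
    import java.util.regex.Pattern;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.StorageLevels;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    import scala.Tuple2;

    public final class JavaNetworkWordCount {
      private static final Pattern SPACE = Pattern.compile(" ");

      public static void main(String[] args) throws Exception {
        // 1-second batches; reads text lines from the host:port given on the command line
        SparkConf sparkConf = new SparkConf().setAppName("JavaNetworkWordCount");
        JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, Durations.seconds(1));

        JavaReceiverInputDStream<String> lines = ssc.socketTextStream(
            args[0], Integer.parseInt(args[1]), StorageLevels.MEMORY_AND_DISK_SER);

        // split each line into words, then count occurrences per batch
        JavaDStream<String> words = lines.flatMap(x -> Arrays.asList(SPACE.split(x)).iterator());
        JavaPairDStream<String, Integer> wordCounts = words
            .mapToPair(s -> new Tuple2<>(s, 1))
            .reduceByKey((i1, i2) -> i1 + i2);

        wordCounts.print();  // print() is where the driver runs closure cleaning, per the stack trace below
        ssc.start();
        ssc.awaitTermination();
      }
    }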

I get the following error:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/aishwaryapatil/spark-2.3.0-bin-hadoop2.7/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-03-02 00:47:33 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-03-02 00:47:33 INFO  SparkContext:54 - Running Spark version 2.3.0
2018-03-02 00:47:33 INFO  SparkContext:54 - Submitted application: JavaNetworkWordCount
2018-03-02 00:47:33 INFO  SecurityManager:54 - Changing view acls to: aishwaryapatil
2018-03-02 00:47:33 INFO  SecurityManager:54 - Changing modify acls to: aishwaryapatil
2018-03-02 00:47:33 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-03-02 00:47:33 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-03-02 00:47:33 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(aishwaryapatil); groups with view permissions: Set(); users  with modify permissions: Set(aishwaryapatil); groups with modify permissions: Set()
2018-03-02 00:47:33 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 63967.
2018-03-02 00:47:33 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-03-02 00:47:33 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-03-02 00:47:33 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-03-02 00:47:33 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-03-02 00:47:33 INFO  DiskBlockManager:54 - Created local directory at /private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/blockmgr-4a864070-5398-4f8c-ab1e-f21ecb288705
2018-03-02 00:47:33 INFO  MemoryStore:54 - MemoryStore started with capacity 434.4 MB
2018-03-02 00:47:33 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-03-02 00:47:34 INFO  log:192 - Logging initialized @6948ms
2018-03-02 00:47:34 INFO  Server:346 - jetty-9.3.z-SNAPSHOT
2018-03-02 00:47:34 INFO  Server:414 - Started @7003ms
2018-03-02 00:47:34 INFO  AbstractConnector:278 - Started ServerConnector@597f0937{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-03-02 00:47:34 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@34aeacd1{/jobs,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@288f173f{/jobs/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@a22c4d8{/jobs/job,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@35e26d05{/jobs/job/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29fa6b65{/stages,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7c72ecc{/stages/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@47406941{/stages/stage,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@656922a0{/stages/stage/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@44784e2f{/stages/pool,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2440022a{/stages/pool/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@737db7f8{/storage,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f2de715{/storage/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5922d3e9{/storage/rdd,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7d57dbb5{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@630b6190{/environment,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@532e27ab{/environment/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1cdc1bbc{/executors,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f95f1e1{/executors/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@672a1c62{/executors/threadDump,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@459b6c53{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@39e69ea7{/static,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6dab01d9{/,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2e09c51{/api,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@31e76a8d{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@770beef5{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.0.0.77:4040
2018-03-02 00:47:34 INFO  SparkContext:54 - Added JAR file:///Users/aishwaryapatil/spark-2.3.0-bin-hadoop2.7/examples/jars/scopt_2.11-3.7.0.jar at spark://10.0.0.77:63967/jars/scopt_2.11-3.7.0.jar with timestamp 1519980454213
2018-03-02 00:47:34 INFO  SparkContext:54 - Added JAR file:///Users/aishwaryapatil/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar at spark://10.0.0.77:63967/jars/spark-examples_2.11-2.3.0.jar with timestamp 1519980454214
2018-03-02 00:47:34 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-03-02 00:47:34 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 63968.
2018-03-02 00:47:34 INFO  NettyBlockTransferService:54 - Server created on 10.0.0.77:63968
2018-03-02 00:47:34 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-03-02 00:47:34 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.0.0.77, 63968, None)
2018-03-02 00:47:34 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 10.0.0.77:63968 with 434.4 MB RAM, BlockManagerId(driver, 10.0.0.77, 63968, None)
2018-03-02 00:47:34 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.0.0.77, 63968, None)
2018-03-02 00:47:34 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.0.0.77, 63968, None)
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@76bf1bb8{/metrics/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ReceiverTracker:54 - Starting 1 receivers
2018-03-02 00:47:34 INFO  ReceiverTracker:54 - ReceiverTracker started
2018-03-02 00:47:34 INFO  SocketInputDStream:54 - Slide time = 1000 ms
2018-03-02 00:47:34 INFO  SocketInputDStream:54 - Storage level = Serialized 1x Replicated
2018-03-02 00:47:34 INFO  SocketInputDStream:54 - Checkpoint interval = null
2018-03-02 00:47:34 INFO  SocketInputDStream:54 - Remember interval = 1000 ms
2018-03-02 00:47:34 INFO  SocketInputDStream:54 - Initialized and validated org.apache.spark.streaming.dstream.SocketInputDStream@1caa0603
2018-03-02 00:47:34 INFO  FlatMappedDStream:54 - Slide time = 1000 ms
2018-03-02 00:47:34 INFO  FlatMappedDStream:54 - Storage level = Serialized 1x Replicated
2018-03-02 00:47:34 INFO  FlatMappedDStream:54 - Checkpoint interval = null
2018-03-02 00:47:34 INFO  FlatMappedDStream:54 - Remember interval = 1000 ms
2018-03-02 00:47:34 INFO  FlatMappedDStream:54 - Initialized and validated org.apache.spark.streaming.dstream.FlatMappedDStream@447675
2018-03-02 00:47:34 INFO  MappedDStream:54 - Slide time = 1000 ms
2018-03-02 00:47:34 INFO  MappedDStream:54 - Storage level = Serialized 1x Replicated
2018-03-02 00:47:34 INFO  MappedDStream:54 - Checkpoint interval = null
2018-03-02 00:47:34 INFO  MappedDStream:54 - Remember interval = 1000 ms
2018-03-02 00:47:34 INFO  MappedDStream:54 - Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@31949b5e
2018-03-02 00:47:34 INFO  ShuffledDStream:54 - Slide time = 1000 ms
2018-03-02 00:47:34 INFO  ShuffledDStream:54 - Storage level = Serialized 1x Replicated
2018-03-02 00:47:34 INFO  ShuffledDStream:54 - Checkpoint interval = null
2018-03-02 00:47:34 INFO  ShuffledDStream:54 - Remember interval = 1000 ms
2018-03-02 00:47:34 INFO  ShuffledDStream:54 - Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@79bb7f43
2018-03-02 00:47:34 INFO  ForEachDStream:54 - Slide time = 1000 ms
2018-03-02 00:47:34 INFO  ForEachDStream:54 - Storage level = Serialized 1x Replicated
2018-03-02 00:47:34 INFO  ForEachDStream:54 - Checkpoint interval = null
2018-03-02 00:47:34 INFO  ForEachDStream:54 - Remember interval = 1000 ms
2018-03-02 00:47:34 INFO  ForEachDStream:54 - Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@35edf469
2018-03-02 00:47:34 INFO  RecurringTimer:54 - Started timer for JobGenerator at time 1519980455000
2018-03-02 00:47:34 INFO  JobGenerator:54 - Started JobGenerator at 1519980455000 ms
2018-03-02 00:47:34 INFO  JobScheduler:54 - Started JobScheduler
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26457986{/streaming,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2dff7085{/streaming/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@b30a50d{/streaming/batch,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@764a3e5d{/streaming/batch/json,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  ReceiverTracker:54 - Receiver 0 started
2018-03-02 00:47:34 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53a50b0a{/static/streaming,null,AVAILABLE,@Spark}
2018-03-02 00:47:34 INFO  StreamingContext:54 - StreamingContext started
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Got job 0 (start at JavaNetworkWordCount.java:70) with 1 output partitions
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Final stage: ResultStage 0 (start at JavaNetworkWordCount.java:70)
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Parents of final stage: List()
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Missing parents: List()
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:613), which has no missing parents
2018-03-02 00:47:34 INFO  MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 68.7 KB, free 434.3 MB)
2018-03-02 00:47:34 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.1 KB, free 434.3 MB)
2018-03-02 00:47:34 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on 10.0.0.77:63968 (size: 24.1 KB, free: 434.4 MB)
2018-03-02 00:47:34 INFO  SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1039
2018-03-02 00:47:34 INFO  DAGScheduler:54 - Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:613) (first 15 tasks are for partitions Vector(0))
2018-03-02 00:47:34 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 1 tasks
2018-03-02 00:47:35 INFO  JobScheduler:54 - Added jobs for time 1519980455000 ms
2018-03-02 00:47:35 INFO  JobScheduler:54 - Starting job streaming job 1519980455000 ms.0 from job set of time 1519980455000 ms
2018-03-02 00:47:35 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8442 bytes)
2018-03-02 00:47:35 INFO  JobScheduler:54 - Finished job streaming job 1519980455000 ms.0 from job set of time 1519980455000 ms
2018-03-02 00:47:35 ERROR JobScheduler:91 - Error running job streaming job 1519980455000 ms.0
java.lang.IllegalArgumentException
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
    at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
    at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1358)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.take(RDD.scala:1331)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
    at java.base/java.lang.Thread.run(Thread.java:844)
Exception in thread "main" java.lang.IllegalArgumentException
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
    at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
    at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
    at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1358)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.take(RDD.scala:1331)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
    at java.base/java.lang.Thread.run(Thread.java:844)
2018-03-02 00:47:35 INFO  Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2018-03-02 00:47:35 INFO  StreamingContext:54 - Invoking stop(stopGracefully=false) from shutdown hook
2018-03-02 00:47:35 INFO  Executor:54 - Fetching spark://10.0.0.77:63967/jars/scopt_2.11-3.7.0.jar with timestamp 1519980454213
2018-03-02 00:47:35 INFO  ReceiverTracker:54 - Sent stop signal to all 1 receivers
2018-03-02 00:47:35 INFO  TransportClientFactory:267 - Successfully created connection to /10.0.0.77:63967 after 26 ms (0 ms spent in bootstraps)
2018-03-02 00:47:35 INFO  Utils:54 - Fetching spark://10.0.0.77:63967/jars/scopt_2.11-3.7.0.jar to /private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-ad3f998e-683a-4a7b-a278-d1edb5c718d7/userFiles-9df1ab76-e37d-43c5-8712-7a4ed5093086/fetchFileTemp9254663516993947796.tmp
2018-03-02 00:47:35 INFO  Executor:54 - Adding file:/private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-ad3f998e-683a-4a7b-a278-d1edb5c718d7/userFiles-9df1ab76-e37d-43c5-8712-7a4ed5093086/scopt_2.11-3.7.0.jar to class loader
2018-03-02 00:47:35 INFO  Executor:54 - Fetching spark://10.0.0.77:63967/jars/spark-examples_2.11-2.3.0.jar with timestamp 1519980454214
2018-03-02 00:47:35 INFO  Utils:54 - Fetching spark://10.0.0.77:63967/jars/spark-examples_2.11-2.3.0.jar to /private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-ad3f998e-683a-4a7b-a278-d1edb5c718d7/userFiles-9df1ab76-e37d-43c5-8712-7a4ed5093086/fetchFileTemp17462613953619474919.tmp
2018-03-02 00:47:35 INFO  Executor:54 - Adding file:/private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-ad3f998e-683a-4a7b-a278-d1edb5c718d7/userFiles-9df1ab76-e37d-43c5-8712-7a4ed5093086/spark-examples_2.11-2.3.0.jar to class loader
2018-03-02 00:47:35 INFO  RecurringTimer:54 - Started timer for BlockGenerator at time 1519980455400
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Started BlockGenerator
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Started block pushing thread
2018-03-02 00:47:35 INFO  ReceiverSupervisorImpl:54 - Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0: 
2018-03-02 00:47:35 WARN  ReceiverSupervisorImpl:66 - Skip stopping receiver because it has not yet stared
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Stopping BlockGenerator
2018-03-02 00:47:35 INFO  RecurringTimer:54 - Stopped timer for BlockGenerator after time 1519980455600
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Waiting for block pushing thread to terminate
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Pushing out the last 0 blocks
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Stopped block pushing thread
2018-03-02 00:47:35 INFO  BlockGenerator:54 - Stopped BlockGenerator
2018-03-02 00:47:35 INFO  ReceiverSupervisorImpl:54 - Waiting for receiver to be stopped
2018-03-02 00:47:35 INFO  ReceiverSupervisorImpl:54 - Stopped receiver without error
2018-03-02 00:47:35 INFO  Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 751 bytes result sent to driver
2018-03-02 00:47:35 INFO  TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 611 ms on localhost (executor driver) (1/1)
2018-03-02 00:47:35 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool 
2018-03-02 00:47:35 INFO  DAGScheduler:54 - ResultStage 0 (start at JavaNetworkWordCount.java:70) finished in 0.849 s
2018-03-02 00:47:35 INFO  ReceiverTracker:54 - All of the receivers have deregistered successfully
2018-03-02 00:47:35 INFO  ReceiverTracker:54 - ReceiverTracker stopped
2018-03-02 00:47:35 INFO  JobGenerator:54 - Stopping JobGenerator immediately
2018-03-02 00:47:35 INFO  RecurringTimer:54 - Stopped timer for JobGenerator after time 1519980455000
2018-03-02 00:47:35 INFO  JobGenerator:54 - Stopped JobGenerator
2018-03-02 00:47:35 INFO  JobScheduler:54 - Stopped JobScheduler
2018-03-02 00:47:35 INFO  ContextHandler:910 - Stopped o.s.j.s.ServletContextHandler@26457986{/streaming,null,UNAVAILABLE,@Spark}
2018-03-02 00:47:35 INFO  ContextHandler:910 - Stopped o.s.j.s.ServletContextHandler@b30a50d{/streaming/batch,null,UNAVAILABLE,@Spark}
2018-03-02 00:47:35 INFO  ContextHandler:910 - Stopped o.s.j.s.ServletContextHandler@53a50b0a{/static/streaming,null,UNAVAILABLE,@Spark}
2018-03-02 00:47:35 INFO  StreamingContext:54 - StreamingContext stopped successfully
2018-03-02 00:47:35 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-03-02 00:47:35 INFO  AbstractConnector:318 - Stopped Spark@597f0937{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-03-02 00:47:35 INFO  SparkUI:54 - Stopped Spark web UI at http://10.0.0.77:4040
2018-03-02 00:47:35 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-03-02 00:47:35 INFO  MemoryStore:54 - MemoryStore cleared
2018-03-02 00:47:35 INFO  BlockManager:54 - BlockManager stopped
2018-03-02 00:47:35 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-03-02 00:47:35 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-03-02 00:47:35 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-03-02 00:47:35 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-03-02 00:47:35 INFO  ShutdownHookManager:54 - Deleting directory /private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-a271be78-f18d-4e7d-a5e9-61c53c824501
2018-03-02 00:47:35 INFO  ShutdownHookManager:54 - Deleting directory /private/var/folders/_4/gljqqgjs34g4ql_2x5kzjrp40000gn/T/spark-ad3f998e-683a-4a7b-a278-d1edb5c718d7

Can someone please help me?

1 Answer:

Answer 0 (score: 0):

It is probably related to your JDK version. I hit the same problem on a Mac running JDK 10; when I switched back to JDK 8, the problem went away. Spark 2.3's closure cleaner is built on ASM 5 (note the org.apache.xbean.asm5.ClassReader frames in your stack trace), which cannot parse class files compiled for Java 9 or later, hence the bare IllegalArgumentException.
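
If it helps, here is a sketch of how to check and switch the JDK on macOS (assuming JDK 8 is already installed; /usr/libexec/java_home is the macOS-specific helper for locating installed JDKs):

    # show which JDK is currently on the PATH
    java -version

    # point JAVA_HOME at an installed JDK 8 for this shell session
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)

    # re-run the example with the older JDK
    bin/run-example org.apache.spark.examples.streaming.JavaNetworkWordCount localhost 9999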