How to fix "Error running job streaming job" in a Spark Streaming program?

Asked: 2019-08-01 10:20:31

Tags: java apache-spark spark-streaming

I have started a Spark master and one worker. Then I created a Spark Streaming job and tried to submit it, but the master shows a long list of Java errors.

I start the master with this command:

  • spark-class org.apache.spark.deploy.master.Master

and the worker with this one:

  • spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

To submit the Spark job, I tried the command with different sets of parameters:

  • spark-submit --class com.rba.boston.SparkHome RBA-jar-with-dependencies.jar

  • spark-submit --class com.rba.boston.SparkHome --master spark://10.220.45.105:7077 --driver-memory 2G --driver-cores 2 --conf spark.driver.port=9998 --executor-memory 2G --deploy-mode cluster --total-executor-cores 4 RBA-jar-with-dependencies.jar

  • spark-submit --class com.rba.boston.SparkHome --master spark://10.220.45.105:7077 --driver-memory 2G --driver-cores 2 --conf spark.driver.port=9997 --conf spark.driver.host=10.220.45.105 --executor-memory 2G --deploy-mode client --total-executor-cores 4 RBA-jar-with-dependencies.jar

    public static void main(String[] args) throws InterruptedException {
        // Note: setMaster("local[3]") hard-coded here takes precedence over
        // any --master flag passed to spark-submit
        SparkConf conf = new SparkConf().setMaster("local[3]").setAppName("NetworkWordCount");

        // 3-second micro-batches
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(3));

        // Read lines of text from a TCP socket source
        JavaReceiverInputDStream<String> lines = jssc.socketTextStream("10.220.45.105", 9998);

        // Split lines into words, pair each word with 1, then sum counts per word
        JavaDStream<String> words = lines.flatMap(x -> Arrays.asList(x.split(" ")).iterator());
        JavaPairDStream<String, Integer> pairs = words.mapToPair(s -> new Tuple2<>(s, 1));
        JavaPairDStream<String, Integer> wordCounts = pairs.reduceByKey((i1, i2) -> i1 + i2);
        wordCounts.print();

        jssc.start();
        jssc.awaitTermination();   // Wait for the computation to terminate
    }
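
For reference, the per-batch logic of that pipeline (flatMap on whitespace, map each word to a count of 1, reduce by key with a sum) can be checked outside Spark with plain Java streams. This is only a sketch for verifying the word-count logic independently of the cluster; the class name is made up:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountLogic {
    // Mirrors the DStream pipeline — flatMap(split) -> mapToPair(word, 1)
    // -> reduceByKey(sum) — but on an in-memory batch of lines.
    static Map<String, Integer> wordCounts(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))
                .collect(Collectors.toMap(w -> w, w -> 1, Integer::sum));
    }

    public static void main(String[] args) {
        System.out.println(wordCounts(List.of("hello world", "hello spark")));
    }
}
```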
    
Logs after submitting the job:

    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Data/Softwares/spark-2.4.3-bin-hadoop2.6/jars/spark-unsafe_2.11-2.4.3.jar) to method java.nio.Bits.unaligned()
    WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    SparkHome
    19/08/01 15:16:06 INFO SparkContext: Running Spark version 2.4.3
    19/08/01 15:16:06 INFO SparkContext: Submitted application: NetworkWordCount
    19/08/01 15:16:06 INFO SecurityManager: Changing view acls to: SachdeJ
    19/08/01 15:16:06 INFO SecurityManager: Changing modify acls to: SachdeJ
    19/08/01 15:16:06 INFO SecurityManager: Changing view acls groups to:
    19/08/01 15:16:06 INFO SecurityManager: Changing modify acls groups to:
    19/08/01 15:16:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(SachdeJ); groups with view permissions: Set(); users  with modify permissions: Set(SachdeJ); groups with modify permissions: Set()
    19/08/01 15:16:07 INFO Utils: Successfully started service 'sparkDriver' on port 54035.
    19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
    19/08/01 15:16:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
    19/08/01 15:16:07 INFO DiskBlockManager: Created local directory at C:\Users\sachdej\AppData\Local\Temp\blockmgr-cccaa7a1-cfdc-45fb-8ff5-9a800def11ff
    19/08/01 15:16:07 INFO MemoryStore: MemoryStore started with capacity 434.4 MB
    19/08/01 15:16:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    19/08/01 15:16:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://GGN1037742.bsci.bossci.com:4040
    19/08/01 15:16:08 INFO SparkContext: Added JAR file:/C:/Data/SharkTank/com.rba.boston/target/RBA-jar-with-dependencies.jar at spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
    19/08/01 15:16:08 INFO Executor: Starting executor ID driver on host localhost
    19/08/01 15:16:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54057.
    19/08/01 15:16:08 INFO NettyBlockTransferService: Server created on GGN1037742.bsci.bossci.com:54057
    19/08/01 15:16:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
    19/08/01 15:16:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManagerMasterEndpoint: Registering block manager GGN1037742.bsci.bossci.com:54057 with 434.4 MB RAM, BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, GGN1037742.bsci.bossci.com, 54057, None)
    19/08/01 15:16:08 INFO ReceiverTracker: Starting 1 receivers
    19/08/01 15:16:08 INFO ReceiverTracker: ReceiverTracker started
    19/08/01 15:16:08 INFO SocketInputDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO SocketInputDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO SocketInputDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO SocketInputDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO SocketInputDStream: Initialized and validated org.apache.spark.streaming.dstream.SocketInputDStream@628a467a
    19/08/01 15:16:08 INFO FlatMappedDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO FlatMappedDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO FlatMappedDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO FlatMappedDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO FlatMappedDStream: Initialized and validated org.apache.spark.streaming.dstream.FlatMappedDStream@4013fae0
    19/08/01 15:16:08 INFO MappedDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO MappedDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO MappedDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO MappedDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@2e5a119c
    19/08/01 15:16:08 INFO ShuffledDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO ShuffledDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO ShuffledDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO ShuffledDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO ShuffledDStream: Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@2141cddb
    19/08/01 15:16:08 INFO ForEachDStream: Slide time = 3000 ms
    19/08/01 15:16:08 INFO ForEachDStream: Storage level = Serialized 1x Replicated
    19/08/01 15:16:08 INFO ForEachDStream: Checkpoint interval = null
    19/08/01 15:16:08 INFO ForEachDStream: Remember interval = 3000 ms
    19/08/01 15:16:08 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@1148f61f
    19/08/01 15:16:08 INFO RecurringTimer: Started timer for JobGenerator at time 1564652769000
    19/08/01 15:16:08 INFO JobGenerator: Started JobGenerator at 1564652769000 ms
    19/08/01 15:16:08 INFO JobScheduler: Started JobScheduler
    19/08/01 15:16:08 INFO ReceiverTracker: Receiver 0 started
    19/08/01 15:16:08 INFO StreamingContext: StreamingContext started
    19/08/01 15:16:08 INFO DAGScheduler: Got job 0 (start at SparkHome.java:40) with 1 output partitions
    19/08/01 15:16:08 INFO DAGScheduler: Final stage: ResultStage 0 (start at SparkHome.java:40)
    19/08/01 15:16:08 INFO DAGScheduler: Parents of final stage: List()
    19/08/01 15:16:08 INFO DAGScheduler: Missing parents: List()
    19/08/01 15:16:09 INFO DAGScheduler: Submitting ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614), which has no missing parents
    19/08/01 15:16:09 INFO JobScheduler: Added jobs for time 1564652769000 ms
    19/08/01 15:16:09 INFO JobScheduler: Starting job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
    19/08/01 15:16:09 INFO JobScheduler: Finished job streaming job 1564652769000 ms.0 from job set of time 1564652769000 ms
    19/08/01 15:16:09 ERROR JobScheduler: Error running job streaming job 1564652769000 ms.0
    java.lang.IllegalArgumentException: Unsupported class file major version 56
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
            at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
            at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
            at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
            at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
            at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
            at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
            at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
            at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
            at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
            at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
            at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
            at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
            at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
            at scala.collection.immutable.List.foreach(List.scala:392)
            at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
            at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
            at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
            at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
            at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
            at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
            at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
            at scala.util.Try$.apply(Try.scala:192)
            at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:835)
    Exception in thread "main" java.lang.IllegalArgumentException: Unsupported class file major version 56
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
            at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
            at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
            at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
            at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
            at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
            at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
            at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
            at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
            at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
            at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
            at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
            at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
            at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
            at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
            at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
            at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
            at scala.collection.immutable.List.foreach(List.scala:392)
            at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
            at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
            at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
            at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
            at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
            at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
            at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
            at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:735)
            at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$3$1.apply(DStream.scala:734)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
            at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
            at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
            at scala.util.Try$.apply(Try.scala:192)
            at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:835)
    19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 46.6 KB, free 434.4 MB)
    19/08/01 15:16:09 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
    19/08/01 15:16:09 INFO ReceiverTracker: Sent stop signal to all 1 receivers
    19/08/01 15:16:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 16.0 KB, free 434.3 MB)
    19/08/01 15:16:09 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on GGN1037742.bsci.bossci.com:54057 (size: 16.0 KB, free: 434.4 MB)
    19/08/01 15:16:09 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
    19/08/01 15:16:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (Receiver 0 ParallelCollectionRDD[0] at makeRDD at ReceiverTracker.scala:614) (first 15 tasks are for partitions Vector(0))
    19/08/01 15:16:09 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
    19/08/01 15:16:09 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 8459 bytes)
    19/08/01 15:16:09 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
    19/08/01 15:16:09 INFO Executor: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar with timestamp 1564652768180
    19/08/01 15:16:09 INFO TransportClientFactory: Successfully created connection to GGN1037742.bsci.bossci.com/10.220.45.105:54035 after 31 ms (0 ms spent in bootstraps)
    19/08/01 15:16:09 INFO Utils: Fetching spark://GGN1037742.bsci.bossci.com:54035/jars/RBA-jar-with-dependencies.jar to C:\Users\sachdej\AppData\Local\Temp\spark-5b831314-0918-401a-88d8-1c6a135d554c\userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847\fetchFileTemp6554652685231905365.tmp
    19/08/01 15:16:09 INFO Executor: Adding file:/C:/Users/sachdej/AppData/Local/Temp/spark-5b831314-0918-401a-88d8-1c6a135d554c/userFiles-99183ed0-5ef0-4d7d-8ea1-8e928e026847/RBA-jar-with-dependencies.jar to class loader
    19/08/01 15:16:09 INFO RecurringTimer: Started timer for BlockGenerator at time 1564652770000
    19/08/01 15:16:09 INFO BlockGenerator: Started BlockGenerator
    19/08/01 15:16:09 INFO BlockGenerator: Started block pushing thread
    19/08/01 15:16:09 INFO ReceiverSupervisorImpl: Stopping receiver with message: Registered unsuccessfully because Driver refused to start receiver 0:
    19/08/01 15:16:09 WARN ReceiverSupervisorImpl: Skip stopping receiver because it has not yet stared
    19/08/01 15:16:09 INFO BlockGenerator: Stopping BlockGenerator
    19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for BlockGenerator after time 1564652770200
    19/08/01 15:16:10 INFO BlockGenerator: Waiting for block pushing thread to terminate
    19/08/01 15:16:10 INFO BlockGenerator: Pushing out the last 0 blocks
    19/08/01 15:16:10 INFO BlockGenerator: Stopped block pushing thread
    19/08/01 15:16:10 INFO BlockGenerator: Stopped BlockGenerator
    19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Waiting for receiver to be stopped
    19/08/01 15:16:10 INFO ReceiverSupervisorImpl: Stopped receiver without error
    19/08/01 15:16:10 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 794 bytes result sent to driver
    19/08/01 15:16:10 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 965 ms on localhost (executor driver) (1/1)
    19/08/01 15:16:10 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    19/08/01 15:16:10 INFO DAGScheduler: ResultStage 0 (start at SparkHome.java:40) finished in 1.229 s
    19/08/01 15:16:10 INFO ReceiverTracker: All of the receivers have deregistered successfully
    19/08/01 15:16:10 INFO ReceiverTracker: ReceiverTracker stopped
    19/08/01 15:16:10 INFO JobGenerator: Stopping JobGenerator immediately
    19/08/01 15:16:10 INFO RecurringTimer: Stopped timer for JobGenerator after time 1564652769000
    19/08/01 15:16:10 INFO JobGenerator: Stopped JobGenerator
    19/08/01 15:16:10 INFO JobScheduler: Stopped JobScheduler
    19/08/01 15:16:10 INFO StreamingContext: StreamingContext stopped successfully
    19/08/01 15:16:10 INFO SparkContext: Invoking stop() from shutdown hook
    19/08/01 15:16:10 INFO SparkUI: Stopped Spark web UI at http://GGN1037742.bsci.bossci.com:4040
    19/08/01 15:16:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    19/08/01 15:16:10 INFO MemoryStore: MemoryStore cleared
    19/08/01 15:16:10 INFO BlockManager: BlockManager stopped
    19/08/01 15:16:10 INFO BlockManagerMaster: BlockManagerMaster stopped
    19/08/01 15:16:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    19/08/01 15:16:10 INFO SparkContext: Successfully stopped SparkContext

Master logs after submitting the job:

    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
            at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
            at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
            at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
            at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
            at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
            at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
            at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
            at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2355)
            at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2249)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2087)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)
            at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:430)
            at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
            at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
            at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
            at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
            at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
            at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
            at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
            at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
            at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
    19/08/01 14:03:10 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
            at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
            at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1903)
            at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1772)
            at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
            at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1594)

1 Answer:

Answer 0 (score: 1)

An error like

java.lang.IllegalArgumentException: Unsupported class file major version 56

means the class was compiled with a newer Java version than the one it is being run on.

Version 56 means you compiled with Java 12, and I don't know which Java version your Spark installation runs on. Judging by this ticket, Spark also does not seem to be compatible with Java 11 yet: https://issues.apache.org/jira/browse/SPARK-24417
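
If you want to check what a given .class file was compiled for, the target is encoded directly in its header: bytes 0-3 are the 0xCAFEBABE magic, bytes 4-5 the minor version, and bytes 6-7 the major version (big-endian); major version 44 + N corresponds to Java N, so 52 is Java 8 and 56 is Java 12. A minimal sketch (the class name is made up):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassFileVersion {
    // Reads the class-file major version from bytes 6-7 (big-endian).
    static int majorVersion(byte[] classBytes) {
        return ((classBytes[6] & 0xFF) << 8) | (classBytes[7] & 0xFF);
    }

    // Major version N corresponds to Java (N - 44),
    // e.g. 52 -> Java 8, 55 -> Java 11, 56 -> Java 12.
    static String jdkRelease(int major) {
        return "Java " + (major - 44);
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = Files.readAllBytes(Path.of(args[0]));
        int major = majorVersion(bytes);
        System.out.println("major version " + major + " => " + jdkRelease(major));
    }
}
```

The same information is printed by `javap -verbose SparkHome.class` (look for the "major version" line).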

The solution is to compile the code with an earlier version of the JDK.
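
If the project is built with Maven (the "jar-with-dependencies" name suggests it is), one way to do this is to pin the bytecode target in the compiler plugin. This is a sketch, assuming Maven and that Java 8 bytecode is acceptable for your Spark 2.4 setup:

```xml
<!-- In pom.xml: compile for Java 8 bytecode (class file major version 52),
     which Spark 2.4.x supports. <release> requires building with JDK 9+. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <release>8</release>
  </configuration>
</plugin>
```

If you build with JDK 8 itself, use `<source>1.8</source>` and `<target>1.8</target>` instead of `<release>`.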