What causes "unknown resolver null" with the Spark Kafka connector?

Asked: 2017-11-08 21:12:13

Tags: java apache-spark apache-kafka spark-streaming spark-submit

I am new to Spark. On my local machine I have started ZooKeeper, Kafka (0.10.1.1), and a Spark standalone cluster (2.2.0) with one master and two workers. My local Scala version is 2.12.3.
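
For reference, roughly how that local setup is started (a sketch only: the paths assume the Kafka 0.10.1.1 distribution directory and the Homebrew Spark install shown in the log, and SPARK_WORKER_INSTANCES is assumed to be set to 2 in conf/spark-env.sh to get two workers):

# ZooKeeper and the Kafka broker, from the Kafka distribution directory
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties

# Spark standalone master plus workers
$SPARK_HOME/sbin/start-master.sh
$SPARK_HOME/sbin/start-slave.sh spark://TUSMA06RMLVT047:7077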

I am able to run a WordCount job on Spark, and I can publish and consume messages on a Kafka topic with the Kafka console producer and consumer.
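
The console check was along these lines (a sketch; the topic name "test" and the broker address localhost:9092 are placeholders):

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning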

The problem I am running into: whenever I add the Kafka package via spark-submit --packages, I get

...
:: problems summary ::
:::: ERRORS
    unknown resolver null

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 13 already retrieved (0kB/9ms)
...

even when my job does not use the Kafka connector at all. The full log is below.

Command

$SPARK_HOME/bin/spark-submit --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.2.0 --master spark://TUSMA06RMLVT047:7077 build/libs/sparkdriver-1.0-SNAPSHOT.jar

Log

Ivy Default Cache set to: /Users/v0001/.ivy2/cache
The jars for the packages stored in: /Users/v0001/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/2.2.0/libexec/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found org.apache.spark#spark-streaming-kafka-0-10_2.11;2.2.0 in local-m2-cache
    found org.apache.kafka#kafka_2.11;0.10.0.1 in local-m2-cache
    found com.101tec#zkclient;0.8 in local-m2-cache
    found org.slf4j#slf4j-api;1.7.16 in spark-list
    found org.slf4j#slf4j-log4j12;1.7.16 in spark-list
    found log4j#log4j;1.2.17 in spark-list
    found com.yammer.metrics#metrics-core;2.2.0 in local-m2-cache
    found org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 in spark-list
    found org.apache.kafka#kafka-clients;0.10.0.1 in local-m2-cache
    found net.jpountz.lz4#lz4;1.3.0 in spark-list
    found org.xerial.snappy#snappy-java;1.1.2.6 in spark-list
    found org.apache.spark#spark-tags_2.11;2.2.0 in local-m2-cache
    found org.spark-project.spark#unused;1.0.0 in spark-list
:: resolution report :: resolve 1805ms :: artifacts dl 14ms
    :: modules in use:
    com.101tec#zkclient;0.8 from local-m2-cache in [default]
    com.yammer.metrics#metrics-core;2.2.0 from local-m2-cache in [default]
    log4j#log4j;1.2.17 from spark-list in [default]
    net.jpountz.lz4#lz4;1.3.0 from spark-list in [default]
    org.apache.kafka#kafka-clients;0.10.0.1 from local-m2-cache in [default]
    org.apache.kafka#kafka_2.11;0.10.0.1 from local-m2-cache in [default]
    org.apache.spark#spark-streaming-kafka-0-10_2.11;2.2.0 from local-m2-cache in [default]
    org.apache.spark#spark-tags_2.11;2.2.0 from local-m2-cache in [default]
    org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 from spark-list in [default]
    org.slf4j#slf4j-api;1.7.16 from spark-list in [default]
    org.slf4j#slf4j-log4j12;1.7.16 from spark-list in [default]
    org.spark-project.spark#unused;1.0.0 from spark-list in [default]
    org.xerial.snappy#snappy-java;1.1.2.6 from spark-list in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   13  |   2   |   2   |   0   ||   13  |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
    unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 13 already retrieved (0kB/9ms)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/11/08 15:53:55 INFO SparkContext: Running Spark version 2.2.0
17/11/08 15:53:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 15:53:55 INFO SparkContext: Submitted application: WordCount
17/11/08 15:53:55 INFO SecurityManager: Changing view acls to: v0001
17/11/08 15:53:55 INFO SecurityManager: Changing modify acls to: v0001
17/11/08 15:53:55 INFO SecurityManager: Changing view acls groups to: 
17/11/08 15:53:55 INFO SecurityManager: Changing modify acls groups to: 
17/11/08 15:53:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(v0001); groups with view permissions: Set(); users  with modify permissions: Set(v0001); groups with modify permissions: Set()
17/11/08 15:53:55 INFO Utils: Successfully started service 'sparkDriver' on port 63760.
17/11/08 15:53:55 INFO SparkEnv: Registering MapOutputTracker
17/11/08 15:53:55 INFO SparkEnv: Registering BlockManagerMaster
17/11/08 15:53:55 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/11/08 15:53:55 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/11/08 15:53:55 INFO DiskBlockManager: Created local directory at /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/blockmgr-b6a7af13-30eb-43ef-a235-e42105699289
17/11/08 15:53:55 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/11/08 15:53:55 INFO SparkEnv: Registering OutputCommitCoordinator
17/11/08 15:53:55 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/11/08 15:53:55 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.1.2:4040
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar at spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar with timestamp 1510174435998
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar at spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar at spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/com.101tec_zkclient-0.8.jar at spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.slf4j_slf4j-log4j12-1.7.16.jar at spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar at spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar at spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.slf4j_slf4j-api-1.7.16.jar at spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/log4j_log4j-1.2.17.jar at spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at spark://10.0.1.2:63760/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar at spark://10.0.1.2:63760/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/iot/thingspace/go/src/stash.verizon.com/npdthing/metrics/sparkdriver/build/libs/sparkdriver-1.0-SNAPSHOT.jar at spark://10.0.1.2:63760/jars/sparkdriver-1.0-SNAPSHOT.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO Executor: Starting executor ID driver on host localhost
17/11/08 15:53:56 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 63761.
17/11/08 15:53:56 INFO NettyBlockTransferService: Server created on 10.0.1.2:63761
17/11/08 15:53:56 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/11/08 15:53:56 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.1.2:63761 with 366.3 MB RAM, BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 236.5 KB, free 366.1 MB)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.9 KB, free 366.0 MB)
17/11/08 15:53:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.1.2:63761 (size: 22.9 KB, free: 366.3 MB)
17/11/08 15:53:56 INFO SparkContext: Created broadcast 0 from textFile at WordCount.java:15
17/11/08 15:53:56 INFO FileInputFormat: Total input paths to process : 1
17/11/08 15:53:56 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
17/11/08 15:53:56 INFO SparkContext: Starting job: saveAsTextFile at WordCount.java:21
17/11/08 15:53:56 INFO DAGScheduler: Registering RDD 2 (flatMapToPair at WordCount.java:18)
17/11/08 15:53:56 INFO DAGScheduler: Got job 0 (saveAsTextFile at WordCount.java:21) with 1 output partitions
17/11/08 15:53:56 INFO DAGScheduler: Final stage: ResultStage 1 (saveAsTextFile at WordCount.java:21)
17/11/08 15:53:56 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
17/11/08 15:53:56 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
17/11/08 15:53:56 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at flatMapToPair at WordCount.java:18), which has no missing parents
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.3 KB, free 366.0 MB)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.1 KB, free 366.0 MB)
17/11/08 15:53:56 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.0.1.2:63761 (size: 3.1 KB, free: 366.3 MB)
17/11/08 15:53:56 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
17/11/08 15:53:56 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at flatMapToPair at WordCount.java:18) (first 15 tasks are for partitions Vector(0))
17/11/08 15:53:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/11/08 15:53:56 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 4937 bytes)
17/11/08 15:53:56 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/11/08 15:53:56 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO TransportClientFactory: Successfully created connection to /10.0.1.2:63760 after 30 ms (0 ms spent in bootstraps)
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4839646631087629609.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.slf4j_slf4j-api-1.7.16.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp8667361266232337100.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.kafka_kafka_2.11-0.10.0.1.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp5418243157152191799.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.slf4j_slf4j-log4j12-1.7.16.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2366789843424249528.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2527586655699915856.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.spark_spark-tags_2.11-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4436635514367901872.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.spark-project.spark_unused-1.0.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4322710809557945921.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/com.101tec_zkclient-0.8.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar with timestamp 1510174435998
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp6210645736090344233.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2587760876873828850.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/log4j_log4j-1.2.17.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp8763096223513955185.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/com.yammer.metrics_metrics-core-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2368772990989848791.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.kafka_kafka-clients-0.10.0.1.jar to class loader
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp5933403694236070460.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.xerial.snappy_snappy-java-1.1.2.6.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/sparkdriver-1.0-SNAPSHOT.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/sparkdriver-1.0-SNAPSHOT.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp9172284954823303788.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/sparkdriver-1.0-SNAPSHOT.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/net.jpountz.lz4_lz4-1.3.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2018048990610379910.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/net.jpountz.lz4_lz4-1.3.0.jar to class loader
17/11/08 15:53:57 INFO HadoopRDD: Input split: file:/usr/local/Cellar/apache-spark/2.2.0/libexec/logs/spark-v0001-org.apache.spark.deploy.master.Master-1-TUSMA06RMLVT047.out:0+3424
17/11/08 15:53:57 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1192 bytes result sent to driver
17/11/08 15:53:57 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 571 ms on localhost (executor driver) (1/1)
17/11/08 15:53:57 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
17/11/08 15:53:57 INFO DAGScheduler: ShuffleMapStage 0 (flatMapToPair at WordCount.java:18) finished in 0.590 s
17/11/08 15:53:57 INFO DAGScheduler: looking for newly runnable stages
17/11/08 15:53:57 INFO DAGScheduler: running: Set()
17/11/08 15:53:57 INFO DAGScheduler: waiting: Set(ResultStage 1)
17/11/08 15:53:57 INFO DAGScheduler: failed: Set()
17/11/08 15:53:57 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[4] at saveAsTextFile at WordCount.java:21), which has no missing parents
17/11/08 15:53:57 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 72.6 KB, free 366.0 MB)
17/11/08 15:53:57 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 26.1 KB, free 365.9 MB)
17/11/08 15:53:57 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 10.0.1.2:63761 (size: 26.1 KB, free: 366.2 MB)
17/11/08 15:53:57 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006
17/11/08 15:53:57 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[4] at saveAsTextFile at WordCount.java:21) (first 15 tasks are for partitions Vector(0))
17/11/08 15:53:57 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
17/11/08 15:53:57 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes)
17/11/08 15:53:57 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
17/11/08 15:53:57 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
17/11/08 15:53:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
17/11/08 15:53:57 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
17/11/08 15:53:57 INFO FileOutputCommitter: Saved output of task 'attempt_20171108155356_0001_m_000000_1' to file:/Users/v0001/iot/thingspace/go/src/stash.verizon.com/npdthing/metrics/sparkdriver/wordcount.out/_temporary/0/task_20171108155356_0001_m_000000
17/11/08 15:53:57 INFO SparkHadoopMapRedUtil: attempt_20171108155356_0001_m_000000_1: Committed
17/11/08 15:53:57 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1224 bytes result sent to driver
17/11/08 15:53:57 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 123 ms on localhost (executor driver) (1/1)
17/11/08 15:53:57 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
17/11/08 15:53:57 INFO DAGScheduler: ResultStage 1 (saveAsTextFile at WordCount.java:21) finished in 0.123 s
17/11/08 15:53:57 INFO DAGScheduler: Job 0 finished: saveAsTextFile at WordCount.java:21, took 0.852166 s
17/11/08 15:53:57 INFO SparkContext: Invoking stop() from shutdown hook
17/11/08 15:53:57 INFO SparkUI: Stopped Spark web UI at http://10.0.1.2:4040
17/11/08 15:53:57 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/11/08 15:53:57 INFO MemoryStore: MemoryStore cleared
17/11/08 15:53:57 INFO BlockManager: BlockManager stopped
17/11/08 15:53:57 INFO BlockManagerMaster: BlockManagerMaster stopped
17/11/08 15:53:57 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/11/08 15:53:57 INFO SparkContext: Successfully stopped SparkContext
17/11/08 15:53:57 INFO ShutdownHookManager: Deleting directory /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965

1 Answer:

Answer 0 (score: 3)

After running into the same problem, I deleted the Ivy cache under ~/.ivy2 and the Maven cache under ~/.m2. That fixed this issue for me; it came up from time to time with various packages, mostly when switching from one Scala version to another.
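
In case it helps, a rough sketch of that clean-up (this wipes the entire local Ivy and Maven caches, so everything is re-downloaded on the next resolve; you could instead delete only the org.apache.spark and org.apache.kafka entries):

# remove the local Ivy cache and the jars spark-submit copied there
rm -rf ~/.ivy2/cache ~/.ivy2/jars
# remove the local Maven repository cache
rm -rf ~/.m2/repository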