Getting an error when using Spark Streaming with Python?

Date: 2017-10-17 21:29:43

Tags: python apache-spark apache-kafka spark-streaming

I am using kafka_2.11-0.9.0.0 and zookeeper-3.4.9.

I have started the ZooKeeper service as well as a producer and a consumer. However, when I run the spark-submit command, it throws an error. I submit the Spark job with the following command:

spark-submit --packages org.apache.spark:spark-streaming-kafka_2.11:1.5.0 /usr/local/spark/examples/src/main/python/streaming/kafka_wordcount.py localhost:2181 Hello-Kafka

Here is the error I get in the logs:

17/10/18 02:44:59 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.150.105, 44389)
Traceback (most recent call last):
  File "/usr/local/spark/examples/src/main/python/streaming/kafka_wordcount.py", line 48, in <module>
    kvs = KafkaUtils.createStream(ssc, zkQuorum, "spark-streaming-consumer", {topic: 1})
  File "/usr/local/spark/python/lib/pyspark.zip/pyspark/streaming/kafka.py", line 70, in createStream
  File "/usr/local/spark/python/lib/py4j-0.10.3-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/usr/local/spark/python/lib/py4j-0.10.3-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o27.createStream.
: java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:81)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:151)
    at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createStream(KafkaUtils.scala:555)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 25 more

17/10/18 02:45:01 INFO SparkContext: Invoking stop() from shutdown hook
17/10/18 02:45:01 INFO SparkUI: Stopped Spark web UI at http://192.168.150.105:4040
17/10/18 02:45:01 INFO ContextCleaner: Cleaned accumulator 0
17/10/18 02:45:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/10/18 02:45:01 INFO MemoryStore: MemoryStore cleared
17/10/18 02:45:01 INFO BlockManager: BlockManager stopped
17/10/18 02:45:01 INFO BlockManagerMaster: BlockManagerMaster stopped
17/10/18 02:45:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/10/18 02:45:01 INFO SparkContext: Successfully stopped SparkContext
17/10/18 02:45:01 INFO ShutdownHookManager: Shutdown hook called
17/10/18 02:45:01 INFO ShutdownHookManager: Deleting directory /tmp/spark-ba22aed0-b62d-48b2-8796-12ae197a5b1c/pyspark-189ffe1d-160f-4b6c-8bb2-17a5b7dcb5b7
17/10/18 02:45:01 INFO ShutdownHookManager: Deleting directory /tmp/spark-ba22aed0-b62d-48b2-8796-12ae197a5b1c

Any ideas?

1 Answer:

Answer 0 (score: 1)

It looks like you are trying to use Spark 2.x with a library built for Spark 1.5...

Fix your --packages option to pass a valid library version that matches your Spark installation. You can look up the available versions directly in the Maven repository.
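As a sketch of the fix: first confirm the installed Spark version, then pass a matching connector coordinate. For Spark 2.x, the Kafka 0.8/0.9 integration is published as `spark-streaming-kafka-0-8_<scala-version>`; the `2.2.0` version below is an assumption and should be replaced with whatever `spark-submit --version` reports:

```shell
# Check which Spark version is actually installed.
spark-submit --version

# Re-submit with a connector version that matches Spark (assumed 2.2.0 here).
# For Spark 2.x the artifact is spark-streaming-kafka-0-8_2.11, versioned
# in lockstep with the Spark release, not 1.5.0.
spark-submit \
  --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.2.0 \
  /usr/local/spark/examples/src/main/python/streaming/kafka_wordcount.py \
  localhost:2181 Hello-Kafka
```

The `NoClassDefFoundError: org/apache/spark/Logging` is a telltale sign of this mismatch: `org.apache.spark.Logging` was public in Spark 1.x but was made internal in Spark 2.x, so any 1.x-era library that references it fails to load on a 2.x cluster.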
