How do I append a single row to a Cassandra table using pyspark?

Asked: 2017-04-06 09:50:03

Tags: python cassandra pyspark

I want to insert a new row into an already existing Cassandra table. I am using pyspark_cassandra.

Spark version - 1.4.1

Scala version - 2.10.6

Cassandra version - 2.2.3

Python version - 2.7.6

Python script -

from pyspark.conf import SparkConf
from pyspark_cassandra import CassandraSparkContext

# spark.cassandra.connection.host expects a bare hostname/IP, not a URL,
# so "127.0.0.1" rather than "http://127.0.0.1".
conf = (SparkConf()
        .setAppName("PySpark Cassandra Test")
        .setMaster("spark://0.0.0.0:7077")
        .set("spark.cassandra.connection.host", "127.0.0.1"))

sc = CassandraSparkContext(conf=conf)

# A single row as a dict; the keys must match the column names of users.temp.
rdd = sc.parallelize([{"id": 101, "name": "ketan"}])

# saveToCassandra(keyspace, table)
rdd.saveToCassandra("users", "temp")

Command used to run the above script -

sudo /usr/local/spark/bin/spark-submit --jars /path/to/pyspark-cassandra-assembly-<version>.jar \
--driver-class-path /path/to/pyspark-cassandra-assembly-<version>.jar \
--py-files /path/to/pyspark-cassandra-assembly-<version>.jar \
--conf spark.cassandra.connection.host=your,cassandra,node,names \
--master spark://spark-master:7077 \
yourscript.py

I am stuck with the following error -

Traceback (most recent call last):
  File "/home/yourscript.py", line 13, in <module>
    rdd.saveToCassandra("users","temp")
  File "/usr/local/spark/pyspark-cassandra-master/target/scala-2.10/pyspark-cassandra-assembly-0.3.5.jar/pyspark_cassandra/rdd.py", line 83, in saveToCassandra
  File "/usr/local/spark/pyspark-cassandra-master/target/scala-2.10/pyspark-cassandra-assembly-0.3.5.jar/pyspark_cassandra/util.py", line 93, in helper
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o26.newInstance.
: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
    at com.datastax.spark.connector.types.TypeConverter$.<init>(TypeConverter.scala:116)
    at com.datastax.spark.connector.types.TypeConverter$.<clinit>(TypeConverter.scala)
    at pyspark_cassandra.PythonHelper.<init>(PythonHelper.scala:36)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)

17/04/06 15:15:33 INFO SparkContext: Invoking stop() from shutdown hook
17/04/06 15:15:33 INFO SparkUI: Stopped Spark web UI at http://192.168.195.119:4040
17/04/06 15:15:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/04/06 15:15:33 INFO MemoryStore: MemoryStore cleared
17/04/06 15:15:33 INFO BlockManager: BlockManager stopped
17/04/06 15:15:33 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/06 15:15:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/04/06 15:15:33 INFO SparkContext: Successfully stopped SparkContext
17/04/06 15:15:33 INFO ShutdownHookManager: Shutdown hook called
17/04/06 15:15:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-3d6cd846-4e37-47d4-b1e0-d9e68a7d34b3/pyspark-c2b7061a-44f2-4e1c-bf90-ea30b81c3c7d
17/04/06 15:15:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-3d6cd846-4e37-47d4-b1e0-d9e68a7d34b3

I tried executing the above code in the pyspark shell and it works fine for me there: the data is inserted into the Cassandra table. It is only when running it as a Python script via spark-submit that I get stuck at this point.
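For reference, this is how I read the row back from the shell to confirm the insert; a minimal sketch, assuming the same assembly jar is on the classpath (same --jars/--py-files flags as above) and a local single-node Cassandra:

from pyspark.conf import SparkConf
from pyspark_cassandra import CassandraSparkContext

# Assumed local setup; adjust the connection host for your cluster.
conf = (SparkConf()
        .setAppName("Read-back check")
        .set("spark.cassandra.connection.host", "127.0.0.1"))
sc = CassandraSparkContext(conf=conf)

# cassandraTable(keyspace, table) returns an RDD of the table's rows.
print(sc.cassandraTable("users", "temp").collect())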

Am I still missing some configuration?

2 Answers:

Answer 0 (score: 1):

This is a Scala version mismatch error: some of the components on the classpath were built against Scala 2.10 and others against 2.11.

See the Spark Cassandra Connector FAQ for details.

The solution is to make sure every library you load is built for the same Scala version.
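One quick way to confirm the mismatch is to ask the running Spark JVM which Scala it was built with (spark-submit --version prints the same information) and compare that against the scala-2.10/scala-2.11 suffix in the assembly jar's path. A minimal diagnostic sketch; note that sc._jvm is PySpark's internal Py4J gateway, and the snippet assumes a plain SparkContext submitted the same way as your script:

from pyspark import SparkContext

sc = SparkContext(appName="ScalaVersionCheck")

# scala.util.Properties.versionString() is a standard Scala API with a
# static forwarder, so it is callable from Python through the Py4J gateway.
print(sc._jvm.scala.util.Properties.versionString())  # e.g. "version 2.10.6"

sc.stop()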

Answer 1 (score: 0):

Try running it with the following command:

spark-submit --packages anguenot/pyspark-cassandra:2.4.0 yourscript.py
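If you are moving to Spark 2.x anyway to use that package, the DataFrame API of the DataStax spark-cassandra-connector is an alternative route for the same single-row append. A sketch, assuming --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.0 on the classpath and the users.temp table from the question:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("Cassandra single-row append")
         .config("spark.cassandra.connection.host", "127.0.0.1")  # assumed local node
         .getOrCreate())

# A one-row DataFrame whose columns match the target table.
row = spark.createDataFrame([(101, "ketan")], ["id", "name"])

(row.write
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="users", table="temp")
    .mode("append")
    .save())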