I get the following error when using the spark-cassandra-connector:
ERROR executor.Executor: Exception in task 0.0 in stage 10.0 (TID 207)
java.lang.NoSuchMethodError: org.apache.spark.executor.TaskMetrics.inputMetrics_$eq(Lscala/Option;)V
at com.datastax.spark.connector.metrics.InputMetricsUpdater$.apply(InputMetricsUpdater.scala:61)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.compute(CassandraTableScanRDD.scala:196)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
My code:
// Imports assumed here: the connector 1.2.x Java API ("japi") package layout and commons-lang3.
import org.apache.commons.lang3.StringUtils;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import com.datastax.spark.connector.japi.CassandraJavaUtil;
import com.datastax.spark.connector.japi.CassandraRow;
import com.datastax.spark.connector.japi.SparkContextJavaFunctions;

// Local Spark context pointing at a Cassandra node on localhost.
SparkConf conf = new SparkConf(true).setMaster("local").setAppName("org.sparkexample.SparkCassandra")
        .set("spark.executor.memory", "1g").set("spark.cassandra.connection.host", "localhost")
        .set("spark.cassandra.connection.native.port", "9042")
        .set("spark.cassandra.connection.rpc.port", "9160");
SparkContext ctx = new SparkContext(conf);

// Scan the table and render each row as a String.
SparkContextJavaFunctions functions = CassandraJavaUtil.javaFunctions(ctx);
JavaRDD<String> cassandraRowsRDD = functions.cassandraTable("sparktest", "SPARK_PERSON")
        .map(new Function<CassandraRow, String>() {
            public String call(CassandraRow cassandraRow) throws Exception {
                return cassandraRow.toString();
            }
        });
System.out.println("Data as CassandraRows: \n" + StringUtils.join(cassandraRowsRDD.toArray(), "\n"));
I tried googling this problem and found that it can be solved by using a compatible Scala version. But I am using the Java connector.
How can I solve this problem?
Thanks.
Answer 0 (score: 3)
I downgraded the Spark version from 1.3.1 to 1.2.1 to solve this issue temporarily. I was using spark-cassandra-connector-java_2.10 version 1.2.1, which is built against Spark 1.2.x; the NoSuchMethodError appears because Spark 1.3 changed the TaskMetrics API (the inputMetrics setter the connector calls no longer exists), so the Spark and connector version series must be kept in step.
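For reference, a minimal sketch of an aligned dependency pairing, assuming a Maven build (these coordinates are the artifacts as published for the 1.2.x series; pick whichever matched pair you standardize on):

<!-- Example pairing only: keep the Spark and connector series aligned. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-java_2.10</artifactId>
  <version>1.2.1</version>
</dependency>

Alternatively, if you want to stay on Spark 1.3.x, upgrading the connector to its 1.3-compatible series instead of downgrading Spark should resolve the same incompatibility.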