Cassandra Spark Connector version conflict with Spark 2.2

Date: 2019-01-28 09:53:16

Tags: cassandra-3.0 apache-spark-2.2

I am getting an error when running a Spark job. Please suggest the correct versions of Spark and the Cassandra connector.

Below is my build.sbt:


Once I submit the Spark job, I get the error below.

scalaVersion := "2.11.8"



libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "2.2.0-cdh6.0.1" % "provided",
  "org.apache.spark" %% "spark-core" % "2.2.0-cdh6.0.1" % "provided", // excludeAll ExclusionRule(organization = "javax.servlet"),
  "org.apache.spark" %% "spark-sql" % "2.2.0-cdh6.0.1" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0-cdh6.0.1",
  "org.apache.hbase" % "hbase-client" % "2.0.0-cdh6.0.1",
  "org.apache.hbase" % "hbase-common" % "2.0.0-cdh6.0.1",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.10",
  "net.liftweb" %% "lift-json" % "3.3.0",
  "com.typesafe" % "config" % "1.2.1"
)

1 answer:

Answer 0: (score: 2)

I ran into a similar problem with spark-cassandra-connector, so here is what worked: for Spark 2.2 and Scala 2.11.8, spark-cassandra-connector 2.3.0 works. Also add the commons-configuration jar, version 1.9, because otherwise it throws NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException. Try those dependency versions.

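The version changes suggested in the answer could be sketched in build.sbt as follows. This is a sketch, not a tested build: only the two version numbers (connector 2.3.0, commons-configuration 1.9) come from the answer, and the connector line is meant to replace the 2.0.10 entry in the question's existing `libraryDependencies` block.

```scala
// build.sbt sketch, assuming Spark 2.2.x and scalaVersion 2.11.8 as in the question.
libraryDependencies ++= Seq(
  // Connector version the answer reports as working with Spark 2.2 / Scala 2.11.
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.3.0",
  // Added to avoid NoClassDefFoundError:
  // org/apache/commons/configuration/ConfigurationException
  "commons-configuration" % "commons-configuration" % "1.9"
)
```

Note that `commons-configuration` is a plain Java artifact, so it uses `%` (no Scala cross-version suffix), while the connector uses `%%` to pick up the `_2.11` build matching the project's scalaVersion.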