Problem running spark-submit: java.lang.NoSuchMethodError: com.couchbase.spark.streaming.Mutation.key()

Date: 2019-02-22 11:45:39

Tags: scala apache-spark sbt spark-submit

I have the following Scala code, which I build and run with sbt. sbt run works fine.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.couchbase.spark.streaming._

object StreamingExample {

  def main(args: Array[String]): Unit = {

    // Create the Spark config and point it at the travel-sample bucket
    // with no password.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("StreamingExample")
      .set("com.couchbase.bucket.travel-sample", "")

    // Initialize the StreamingContext with a batch interval of 5 seconds.
    val ssc = new StreamingContext(conf, Seconds(5))

    // Consume the DCP stream from the beginning and never stop,
    // printing the key of every mutation and deletion.
    ssc
      .couchbaseStream(from = FromBeginning, to = ToInfinity)
      .foreachRDD { rdd =>
        rdd.foreach {
          case m: Mutation => println("mutated: " + m.key.map(_.toChar).mkString)
          case d: Deletion => println("deleted: " + d.key.map(_.toChar).mkString)
          case _           => // ignore other stream messages
        }
      }

    // Start the stream and await termination.
    ssc.start()
    ssc.awaitTermination()
  }
}

However, it fails when run as a Spark job like this:

spark-submit --class "StreamingExample" --master "local[*]" target/scala-2.11/spark-samples_2.11-1.0.jar

The error is: java.lang.NoSuchMethodError: com.couchbase.spark.streaming.Mutation.key()
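A quick sanity check (using the jar path from the command above) is to list the application jar's contents and see whether the connector classes were bundled at all; plain sbt package only packages the project's own classes, not its library dependencies:

jar tf target/scala-2.11/spark-samples_2.11-1.0.jar | grep -i couchbase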

Below is my build.sbt:

lazy val root = (project in file(".")).
  settings(
    name := "spark-samples",
    version := "1.0",
    scalaVersion := "2.11.12",
    mainClass in Compile := Some("StreamingExample")        
  )

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-streaming" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)

// META-INF discarding
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
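For comparison, a common sbt-assembly variant (a sketch, assuming the sbt-assembly plugin is already enabled in project/plugins.sbt) marks the Spark artifacts as provided, since spark-submit supplies them at runtime, so that only the connector needs to ship in the assembled fat jar:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"       % "2.4.0" % "provided",
  // Not part of the Spark installation, so it must be bundled.
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)

Running sbt assembly then produces a fat jar (by default target/scala-2.11/spark-samples-assembly-1.0.jar) that can be passed to spark-submit in place of the jar from sbt package.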

The Spark version installed on my machine is 2.4.0, built with Scala 2.11.12.

Observations

I don't see com.couchbase.client_spark-connector_2.11-2.2.0 among the Spark jars (/usr/local/Cellar/apache-spark/2.4.0/libexec/jars), but the older com.couchbase.client_spark-connector_2.10-1.2.0.jar is present.
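This can be verified with a quick listing of the installation's jars directory (path as above, for a Homebrew install):

ls /usr/local/Cellar/apache-spark/2.4.0/libexec/jars | grep -i couchbase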

  • Why does spark-submit not work?
  • How is sbt able to run this program, and where does it download the dependencies to?

1 answer:

Answer 0 (score: 1)

Make sure that the Scala version used by sbt and the version of the Spark connector library both match your Spark installation. Given the observation above, the stale spark-connector_2.10-1.2.0.jar in Spark's jars directory is a plausible culprit: Spark's own jars are searched before the application jar by default, so the old connector's Mutation class (without the newer key() method) wins at runtime.
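One way to make the versions line up explicitly (a sketch; the Maven coordinates assumed here mirror the build.sbt above) is to have spark-submit fetch the matching connector itself instead of relying on whatever sits in Spark's jars directory:

spark-submit \
  --class StreamingExample \
  --master "local[*]" \
  --packages com.couchbase.client:spark-connector_2.11:2.2.0 \
  target/scala-2.11/spark-samples_2.11-1.0.jar

If the stale spark-connector_2.10-1.2.0.jar remains in the installation's jars directory, it may still shadow the newer classes and need to be removed. As for why sbt run works: sbt puts every declared dependency on the runtime classpath, downloaded to the local cache (typically ~/.ivy2), whereas spark-submit only sees the application jar plus Spark's own jars.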

I ran into a similar issue when trying to run a sample Flink job on my system, and the cause was a version mismatch.