CassandraSQLContext cannot be found

Time: 2016-03-23 09:11:29

Tags: apache-spark apache-spark-sql spark-streaming datastax

I am using Spark 1.6 (prebuilt for Hadoop 2.6), Spark Cassandra Connector 1.6, and Cassandra 2.1.12.

I wrote a simple Scala program that runs a simple select count(*) query against Cassandra. Here is my code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._
import org.apache.spark.sql.cassandra.CassandraSQLContext
import com.datastax.spark.connector._

object Hi {
  def main(args: Array[String]): Unit = {
    // point the connector at the Cassandra node
    val conf = new SparkConf(true).set("spark.cassandra.connection.host", "172.16.4.196")
    val sc = new SparkContext("spark://naresh-pc:7077", "test", conf)

    // SQL context backed by Cassandra tables (Spark 1.x / connector 1.x API)
    val csc = new CassandraSQLContext(sc)
    val rdd1 = csc.sql("SELECT count(*) from cw.testdata")

    println(rdd1.count)
    println(rdd1.first)
  }
}

It builds successfully with sbt assembly and produces the jar, but when I submit it with spark-submit it gives the following error:
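For context, the question does not show the build file. A build.sbt along the following lines would match the versions mentioned; the coordinates are the standard Maven coordinates for these releases, and which dependencies are marked "provided" is exactly what determines whether a class like CassandraSQLContext ends up inside the assembly jar:

```scala
// Hypothetical build.sbt reconstructing the versions named in the question.
// Spark 1.6 prebuilt distributions use Scala 2.10 by default.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // Spark itself is supplied by the cluster at runtime, so it is "provided"
  // and left out of the assembly jar.
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.0" % "provided",
  // The connector is NOT shipped by spark-submit; if it is also marked
  // "provided" it will be missing from the assembly, producing exactly a
  // NoClassDefFoundError for org.apache.spark.sql.cassandra.CassandraSQLContext.
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0"
)
```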

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/cassandra/CassandraSQLContext
    at Hi$.main(trySpark.scala:15)
    at Hi.main(trySpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.cassandra.CassandraSQLContext
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

Any help with this?

Also, when I run it with spark-shell, it works fine.
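The exact shell invocation is not shown in the question; when spark-shell is launched with the connector placed on the classpath explicitly, the class resolves regardless of what is in the application jar, which would explain the difference. A typical (hypothetical) launch looks like:

```shell
# Hypothetical spark-shell launch: --packages downloads the connector from
# Maven Central and adds it to both the driver and executor classpaths,
# so CassandraSQLContext is visible without any assembly jar.
spark-shell \
  --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.0 \
  --conf spark.cassandra.connection.host=172.16.4.196
```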

0 Answers:

No answers yet