My code:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // adds cassandraTable to SparkContext

object fileOpts {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("SparkCassandra")
      .set("spark.cassandra.connection.host", "x.x.x.x")
    val sc = new SparkContext(conf)
    val rdd = sc.cassandraTable("keyspace", "table")
    // collect() pulls the whole table to the driver; take(100) then keeps the first 100 rows
    val file_collect = rdd.collect().take(100)
    file_collect.foreach(println)
    sc.stop()
  }
}
```
I am using the following entries in my pom file:
```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.4.0-M1</version>
</dependency>
```
But I am getting the following error:
```
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at org.apache.spark.util.TimeStampedWeakValueHashMap.<init>(TimeStampedWeakValueHashMap.scala:42)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:277)
    at com.spark.test.raj.fileOpts$.main(fileOpts.scala:30)
    at com.spark.test.raj.fileOpts.main(fileOpts.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 4 more
159 [Thread-0] INFO org.apache.spark.util.Utils - Shutdown hook called
```
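For reference, `scala/collection/GenTraversableOnce$class` is a class from the Scala standard library itself, so this error usually indicates that the Scala version on the runtime classpath does not match the `_2.10` Spark artifacts above. A minimal sketch of pinning the runtime library to the 2.10 line explicitly in the pom (the patch version 2.10.5 here is an assumption, not from the question; it should match whatever your Scala compiler plugin targets):

```xml
<!-- Sketch: keep scala-library on the same 2.10.x line as the _2.10 Spark artifacts.
     2.10.5 is an assumed patch version for illustration. -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version>
</dependency>
```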