Need some help setting up Spark with Cassandra in Java

Date: 2015-06-24 12:25:16

Tags: java eclipse apache-spark spark-cassandra-connector

Setting up Spark to access Cassandra from Java throws a NoClassDefFoundError:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(Unknown Source)
    at java.security.SecureClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.access$100(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at Client.main(Client.java:22)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 13 more

Two jar files were added: spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar and spark-core_2.10-0.9.0-incubating.jar. The spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar is built against Scala 2.10. Typing scala -version at the command prompt shows Scala code runner version 2.11.6. Getting Spark from the spark-shell works without problems; even accessing a Cassandra column family from spark-shell works fine.
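The `scala/Cloneable` error usually means the Scala runtime (scala-library) is missing from the application classpath, or the jars mix Scala versions (the connector assembly targets 2.10 while the installed Scala is 2.11.6). With Maven, one way to keep everything on the same Scala 2.10 line might look like the sketch below; the group IDs and the `_2.10` artifact suffixes are real conventions, but the exact version numbers are illustrative assumptions and should match the jars you actually use:

```xml
<dependencies>
  <!-- Scala runtime: provides scala.Cloneable and the rest of the standard library -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version>
  </dependency>
  <!-- Spark core built for Scala 2.10 (note the _2.10 suffix) -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
  </dependency>
  <!-- Cassandra connector Java API, also on the 2.10 line -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.4.0-M1</version>
  </dependency>
</dependencies>
```

Mixing a `_2.10` jar with a `_2.11` one (or with a 2.11 scala-library) on the same classpath will typically fail with exactly this kind of `NoClassDefFoundError`.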

import java.util.*;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.PairFunction;
import com.datastax.spark.connector.*;
import com.datastax.spark.connector.cql.*;
import com.datastax.spark.*;
//import scala.Tuple2;

public class Client {
    public static void main(String[] a)
    {
        SparkConf conf = new SparkConf().setAppName("MTMPNLTesting").setMaster("192.168.1.15");
    }
}

What could be causing this error?

1 answer:

Answer 0 (score: 0)

Also include the Scala jar (scala-library) on your classpath. If you are not using Maven, download the jar and add it to the project's build path properties.
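Without Maven, the same fix can be sketched on the command line by placing scala-library and the other jars on the classpath explicitly. The jar file names and locations below are hypothetical placeholders; adjust them to wherever you downloaded the jars (and use `:` instead of `;` as the separator on Linux/macOS):

```
rem Run the Client class with the Scala runtime on the classpath (Windows syntax)
java -cp "scala-library-2.10.5.jar;spark-core_2.10-1.4.0.jar;spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar;." Client
```

In Eclipse, the equivalent is Project > Properties > Java Build Path > Libraries > Add External JARs, adding the scala-library jar alongside the two Spark jars already listed.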