What do I need to import to make `SparkConf` resolvable?

Asked: 2015-11-18 13:54:29

Tags: datastax-enterprise

I'm setting up a Java Spark application and following the Datastax documentation on getting started with the Java API. I added

<dependencies>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.1.1</version>
    </dependency>
    ...
</dependencies>

and (having previously installed dse.jar into my local Maven repository)

<dependency>
    <groupId>com.datastax</groupId>
    <artifactId>dse</artifactId>
    <version>version number</version>
</dependency>

The next step in the guide is

SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
                .setAppName( "My application");
DseSparkContext sc = new DseSparkContext(conf);

However, the class SparkConf cannot be resolved. Should it be? Am I missing some additional Maven dependency? If so, which one?

1 Answer:

Answer 0 (score: 0):

The class is org.apache.spark.SparkConf, and it lives in the spark-core_*scala version* artifact.
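
In other words, the missing import is just the one below; no DSE-specific import is needed for this particular class:

// SparkConf comes from spark-core, not from the connector or dse.jar
import org.apache.spark.SparkConf;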

So your pom.xml might look like this:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>
    <dependency>
        <groupId>com.datastax</groupId>
        <artifactId>dse</artifactId>
        <version>*version number*</version>
    </dependency>
</dependencies>

The spark-core JAR is also located at: *dse_install*/resources/spark/lib/spark_core_2.10-*version*.jar (tarball) or: /usr/share/dse/spark/lib/spark_core_2.10-*version*.jar (package install).
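
With spark-core on the classpath (via Maven or one of the DSE-provided JARs above), the guide's snippet compiles. A minimal sketch follows; the org.apache.spark.SparkConf import is the one in question, while the com.datastax.bdp.spark package used for the DSE helper classes is an assumption here, so verify it against the classes bundled in your dse.jar:

import org.apache.spark.SparkConf;
// The two imports below are assumptions about where the DSE classes live;
// check the packages inside your installed dse.jar.
import com.datastax.bdp.spark.DseSparkConfHelper;
import com.datastax.bdp.spark.DseSparkContext;

public class MyApplication {
    public static void main(String[] args) {
        // Enrich a plain SparkConf with DSE-specific settings, as in the guide
        SparkConf conf = DseSparkConfHelper.enrichSparkConf(new SparkConf())
                .setAppName("My application");
        // Create the DSE-aware context from the enriched configuration
        DseSparkContext sc = new DseSparkContext(conf);
        sc.stop();
    }
}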