Dependency errors when getting SparkSession and SQLContext

Date: 2018-10-26 09:19:31

Tags: scala maven apache-spark dependencies

In my Spark program, I am getting dependency errors for SQLContext and SparkSession:

val sqlContext = new SQLContext(sc)
val spark = SparkSession.builder()

SQLContext error:

Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class org.apache.spark.sql.SQLContext'. Make sure that type Logging is in your classpath and check for conflicting dependencies with -Ylog-classpath. A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.

SparkSession error:

not found: value SparkSession

Below are the Spark dependencies in my pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-test-tags_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>

1 Answer:

Answer 0 (score: 1)

You cannot mix Spark 2 and Spark 1.6 dependencies in the same project: your spark-core artifact is on 2.0.0-cloudera1-SNAPSHOT while spark-sql, spark-catalyst, and spark-test-tags are all on 1.6.0-cdh5.15.1. org.apache.spark.Logging is no longer part of the public API in Spark 2, so the 1.6-built SQLContext cannot find it on a Spark 2 classpath, which is exactly what the first error reports.

Change

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>

to

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0-cdh5.15.1</version>
</dependency>
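
To make this kind of version drift harder to reintroduce, you can hold the Spark version in a single Maven property and reference it from every Spark artifact. This is not part of the original answer, just a minimal sketch assuming the same CDH-flavored artifacts as above (the property name spark.version is only a convention):

<properties>
    <!-- Single source of truth for the version of every Spark artifact -->
    <spark.version>1.6.0-cdh5.15.1</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>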
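
Note also that SparkSession was only introduced in Spark 2.0, so on a consistent 1.6 classpath the second error (not found: value SparkSession) is expected; SQLContext remains the SQL entry point there. A minimal Spark 1.6 sketch, using a hypothetical app name and a local master purely for illustration:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextDemo {
  def main(args: Array[String]): Unit = {
    // The app name and "local[*]" master are illustrative; use your cluster settings.
    val conf = new SparkConf().setAppName("sqlcontext-demo").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc) // Spark 1.6 entry point for SQL
    sqlContext.range(5).show()          // quick smoke test that SQL support works
    sc.stop()
  }
}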