DataFrame cannot be found in a Java class (Spark)

Time: 2018-05-14 16:29:45

Tags: java apache-spark dataframe apache-spark-sql

I am writing a Java class that uses Spark. I get the error "DataFrame cannot be resolved to a type" and the import error "The import org.apache.spark.sql.DataFrame cannot be resolved". These are the class imports:

import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

import org.apache.spark.sql.DataFrame;

This is the pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>SparkBD</groupId>
    <artifactId>SparkProject</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
    </dependencies>
</project>

1 Answer:

Answer 0 (score: 2)

DataFrame was removed from the Java API in Spark 2.0 (in the Scala API it is just a type alias). You should replace it with Dataset&lt;Row&gt;:

  • Keep only import org.apache.spark.sql.Dataset (and drop the import of org.apache.spark.sql.DataFrame)
  • Wherever you used DataFrame, use Dataset&lt;Row&gt; instead
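To illustrate, here is a minimal sketch of what the corrected class might look like. It assumes a Spark 2.x classpath (matching the pom.xml above) and a hypothetical input file people.json; note that since Spark 2.0, SparkSession is the preferred entry point over SQLContext:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DataFrameExample {
    public static void main(String[] args) {
        // SparkSession replaces SQLContext as the entry point in Spark 2.x
        SparkSession spark = SparkSession.builder()
                .appName("DataFrameExample")
                .master("local[*]")
                .getOrCreate();

        // In the Java API, a "DataFrame" is simply a Dataset of Row objects
        Dataset<Row> df = spark.read().json("people.json"); // hypothetical input file
        df.show();

        spark.stop();
    }
}
```

This will not compile until every occurrence of the DataFrame type in the class is replaced the same way, since the type no longer exists in the Java API.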