RDD and SparkSession

Time: 2019-02-09 15:12:16

Tags: scala apache-spark bigdata

I am new to Scala and am practicing on my own. Could you help me with the following problem?

Cannot resolve symbol SparkSession

This happens when I import org.apache.spark.sql.SparkSession while practicing RDDs and transformations in Scala.
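For context, this is the kind of code the question is about; a minimal sketch (the object name and the RDD example are illustrative, not taken from the question):

import org.apache.spark.sql.SparkSession   // flagged: "Cannot resolve symbol SparkSession"

object RddPractice {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RddPractice")
      .master("local[*]")
      .getOrCreate()

    // a simple RDD transformation
    val doubled = spark.sparkContext.parallelize(1 to 5).map(_ * 2)
    println(doubled.collect().mkString(", "))

    spark.stop()
  }
}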

1 Answer:

Answer 0 (score: 2)

It looks like you are missing the dependencies. If you use Maven, you can add the following to your pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.project.lib</groupId>
    <artifactId>PROJECT</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.1.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.1.1</version>
        </dependency>
    </dependencies>
</project>
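Note that the _2.11 suffix in the artifact IDs encodes the Scala binary version; it must match the Scala version your project compiles with, and all three Spark artifacts should share the same Spark version.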

However, if you use sbt, use the following example in build.sbt:

name := "SparkTest"

version := "0.1"

scalaVersion := "2.11.8"

val sparkVersion = "2.3.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion
)
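SparkSession lives in the spark-sql module rather than spark-core, which is why the import cannot be resolved with spark-core alone. Once the dependency is on the classpath (reload the sbt project, or reimport it in your IDE), something like the following should compile and run; a minimal sketch, assuming local mode and an illustrative object name:

import org.apache.spark.sql.SparkSession

object Verify {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Verify")
      .master("local[*]")   // local mode, for a quick sanity check
      .getOrCreate()

    // toDS() is provided by spark.implicits, which also comes from spark-sql
    import spark.implicits._
    spark.sparkContext.parallelize(Seq(1, 2, 3)).toDS().show()

    spark.stop()
  }
}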