Eclipse: Maven: error: not found: type SparkConf

Date: 2018-01-15 12:04:17

Tags: scala maven apache-spark

I am trying to run the Spark/Scala snippet below in Eclipse. I created a Maven project for it, but when I run the code I get the following error:

not found: type SparkConf

My code:

package extraction

import org.apache.spark._
import org.apache.spark.SparkConf

object JsonParser {

    val conf = new SparkConf().setAppName("Spark json extract")
    conf.setMaster("local");
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    def main(args: Array[String]): Unit = {
        val df = sqlContext.read.json("F:\\test1.json")
        df.registerTempTable("jsonExtract")
        val data = sqlContext.sql("select * from jsonExtract")
        data.show();
        sc.stop
    }

}

pom.xml:

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <groupId>JSON</groupId>
      <artifactId>JSON</artifactId>
      <version>0.0.1-SNAPSHOT</version>
      <dependencies>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.11</artifactId>
          <version>2.2.0</version>
        </dependency>
      </dependencies>
      <build>
        <sourceDirectory>src</sourceDirectory>
        <resources>
          <resource>
            <directory>src</directory>
            <excludes>
              <exclude>**/*.java</exclude>
            </excludes>
          </resource>
        </resources>
        <plugins>
          <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.7.0</version>
            <configuration>
              <source>1.8</source>
              <target>1.8</target>
            </configuration> 
          </plugin>
        </plugins>
      </build>
    </project>

How can I fix this error? Is it not possible to build this project in Eclipse?

1 answer:

Answer 0 (score: 1)

I think your <dependencies> </dependencies> tag is missing. See Maven POM
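
For reference, a well-formed dependency section would sit as a direct child of <project>, something like the sketch below. Note also that the posted code uses SQLContext, which lives in the spark-sql module rather than spark-core, so a spark-sql dependency would likely be needed as well (the versions here simply mirror the ones in the question):

```xml
<!-- Direct child of the <project> element in pom.xml -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <!-- Provides SQLContext / DataFrame, which the posted code uses -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>
```

After changing the POM, right-click the project in Eclipse and run Maven > Update Project so the new dependencies are picked up on the build path.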

Edit

It could also be a repository problem. Check that your repository actually contains the correct libraries:
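
One way to check, assuming the default local repository location (~/.m2 on Linux/macOS, %USERPROFILE%\.m2 on Windows; a <localRepository> override in settings.xml would change the path):

```shell
# List the Spark core artifact in the local Maven repository;
# if the jar is absent or zero-sized, the download failed
ls ~/.m2/repository/org/apache/spark/spark-core_2.11/2.2.0/

# Force Maven to re-resolve and re-download dependencies
mvn -U clean compile
```

If the jar is missing or corrupted, deleting the org/apache/spark folder from the local repository and rebuilding with -U usually forces a clean download.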