Cannot import org.apache.spark in IntelliJ

Date: 2018-05-28 16:00:10

Tags: scala apache-spark intellij-idea

The project is opened in IntelliJ IDEA as a Maven project.

Importing `scala.io.Source` on its own works fine, but when I try to compile the following, the `import org.apache.spark` line does not resolve:

```scala
import scala.io.Source
import org.apache.spark

object main extends App {
  val lines = Source.fromFile("C://share_VB/file.csv").getLines.toArray
  for (line <- lines) {
    if (!line.isEmpty) {
      val testcase = line.split(",").toBuffer
      println(testcase.head)
      println(testcase(1))
      testcase.remove(0, 2)
      while (testcase.nonEmpty) {
        println(testcase.head)
        println(testcase(1))
        testcase.remove(0, 2)
      }
    }
  }
}
```

How can I fix this? If you need more information, please let me know.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>seeifthisworks</groupId>
<artifactId>seeifthisworks</artifactId>
<version>1.0-SNAPSHOT</version>

<properties>
    <java.version>1.8</java.version>
    <scala.version>2.11.8</scala.version>
    <scala.compat.version>2.11</scala.compat.version>
    <spark.version>2.2.0.cloudera1</spark.version>
    <config.version>1.3.2</config.version>
    <scalatest.version>3.0.1</scalatest.version>
    <spark-testing-base.version>2.2.0_0.8.0</spark-testing-base.version>
</properties>

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0-palantir3</version>
</dependency>


</project>

This is my pom.xml file.


1 Answer:

Answer 0 (score: 0)

Fix your Spark versions: replace the hard-coded 2.3.0-palantir3 with ${spark.version}, and drop the cloudera1 suffix from the spark.version property (use a plain 2.2.0) unless you are actually building against the Cloudera repository.
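As a sketch, the corrected POM fragment might look like the following. Note also that in the posted pom.xml the `<dependency>` element sits directly under `<project>`; Maven requires it to be wrapped in a `<dependencies>` element, and without that wrapper the dependency is silently ignored, which alone would explain the unresolved import. The plain 2.2.0 version is an assumption based on the properties in the question:

```xml
<properties>
    <scala.compat.version>2.11</scala.compat.version>
    <!-- Plain upstream version; no vendor suffix such as cloudera1 -->
    <spark.version>2.2.0</spark.version>
</properties>

<!-- Dependencies must live inside a <dependencies> element -->
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.compat.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

After changing the POM, re-import the Maven project in IntelliJ so the dependency is downloaded and indexed.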

In any case, the code you posted contains no actual Spark calls, so it is not clear what you intend to do with the library.