Library does not resolve with sbt inside IntelliJ, but resolves and compiles from the command line

Date: 2016-09-03 08:41:35

Tags: scala intellij-idea apache-spark spark-dataframe databricks

The sbt build file below fails to resolve the Databricks spark-xml package from within IntelliJ IDEA, while the same build works fine from the command line:

name := "dataframes"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.4.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "com.databricks" %% "spark-xml" % "0.3.3"
)

resolvers ++= Seq(
  "Apache HBase" at "https://repository.apache.org/content/repositories/releases",
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)

resolvers += Resolver.mavenLocal

The sbt used by IntelliJ is set to the bundled one, and I have also tried pointing it at the locally installed sbt, but it does not work either way.
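For comparison, here is a sketch of the same dependency written with the Scala binary version spelled out explicitly, a variant sometimes tried when an IDE import misresolves a %% dependency. It assumes the _2.11 artifact of spark-xml 0.3.3 is published for the Scala version above:

// Sketch: same dependency with the Scala suffix written out explicitly
// (assumes com.databricks:spark-xml_2.11:0.3.3 is reachable through the resolvers above)
libraryDependencies += "com.databricks" % "spark-xml_2.11" % "0.3.3"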

The following package resolves from the command line and works perfectly:

import com.databricks.spark._
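For context, a minimal sketch of how the package is typically used with Spark 1.4's SQLContext once it resolves; the rowTag value and file path below are placeholders, not part of the original question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal sketch: read an XML file into a DataFrame via spark-xml.
// "book" and "books.xml" are placeholder values for illustration only.
val conf = new SparkConf().setAppName("dataframes").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

val df = sqlContext.read
  .format("com.databricks.spark.xml")
  .option("rowTag", "book")
  .load("books.xml")

df.printSchema()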

0 Answers:

There are no answers yet.