How do I set up a Spark build.sbt file?

Asked: 2018-03-25 21:30:56

Tags: scala apache-spark intellij-idea sbt

I have been struggling with this all day and cannot figure out how to make it work.

So I have a library that is meant to become the core library of common.

My build.sbt file for Spark does not work:

build.sbt:

```scala
name := "CommonLib"

version := "0.1"

scalaVersion := "2.12.5"

// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"
// resolvers += Resolver.sonatypeRepo("public")

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.hadoop" % "hadoop-common" % "2.7.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  // "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-hive_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-yarn_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),

  "com.github.scopt" %% "scopt" % "3.7.0"
)

//addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

//libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"

//libraryDependencies ++= {
//  val sparkVer = "2.1.0"
//  Seq(
//    "org.apache.spark" %% "spark-core" % sparkVer % "provided" withSources()
//  )
//}
```

All the commented-out lines are the tests I have tried; I no longer know what to do.

My goal is to get Spark 2.3 working and to have scopt available as well.

As for my sbt setup, I have scopt installed.

Thank you.

2 Answers:

Answer 0 (score: 2)

I think I had two main problems.

  1. Spark is not yet compatible with Scala 2.12, so switching to Scala 2.11.12 solved one problem.
  2. The second problem was that for the IntelliJ sbt console to pick up changes to build.sbt, you have to kill and restart the console (or use the `reload` command, which I did not know about), so I was not actually building with the latest version of the sbt file.
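Putting both fixes together, a build.sbt along these lines should work. This is a minimal sketch, not the poster's exact final file: it assumes the project only needs the Spark modules and scopt listed in the question, and the version numbers are illustrative.

```scala
name := "CommonLib"

version := "0.1"

// Spark 2.3 is only published for Scala 2.11, so pin the Scala version here
// instead of 2.12.x.
scalaVersion := "2.11.12"

// Keep the Spark version in one place so all modules stay in sync.
val sparkVersion = "2.3.0"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.11) automatically, which avoids
  // mixing artifacts like spark-core_2.10 with a 2.12 scalaVersion.
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion,
  "com.github.scopt" %% "scopt"      % "3.7.0"
)
```

After editing the file, run `reload` in the sbt shell (or restart IntelliJ's sbt console) so the new settings actually take effect.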

Answer 1 (score: -1)

There is a Giter8 template that works well:

https://github.com/holdenk/sparkProjectTemplate.g8
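Giter8 templates can be instantiated with sbt's `new` command; assuming a recent sbt (0.13.13+), something like this should scaffold the project (the template name is taken from the repository above):

```shell
# Generate a new Spark project skeleton from the Giter8 template;
# sbt will prompt for the project name and other settings.
sbt new holdenk/sparkProjectTemplate.g8
```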