sbt package command: [error] Error parsing expression. Ensure that settings are separated by blank lines

Posted: 2017-08-08 23:36:54

Tags: scala apache-spark centos sbt

I am trying to run a simple command, sbt package, but it fails with the error shown below (the failing command is in bold). I would appreciate it if someone could solve my problem. My Spark version is 2.0 and my Scala version is 2.11.8, on CentOS (Cloudera) with JDK 1.7.

[root@hadoop first]# vim build.sbt 
    name := "First Spark"
    version := "1.0"
    organization := "in.goai"
    scalaVersion := "2.11.8"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
    resolvers += Resolver.mavenLocal


[root@hadoop first]# ls
    build.sbt  project  src


**[root@hadoop first]# sbt package**
[info] Loading project definition from /home/training/Documents/workspace_scala/first/project
[info] Updating {file:/home/training/Documents/workspace_scala/first/project/}first-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
/home/training/Documents/workspace_scala/first/build.sbt:2: error: eof expected but ';' found.
version := "1.0"
^
[error] Error parsing expression.  Ensure that settings are separated by blank lines.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

1 Answer:

Answer 0 (score: 0):

Just add a blank line between each setting in your sbt build file. Hope that helps.

name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

resolvers += Resolver.mavenLocal
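
As a side note (my addition, based on sbt's documented behavior rather than the original answer): the blank-line requirement comes from the `.sbt` parser in sbt 0.13.6 and earlier. Since sbt 0.13.7, settings in a `build.sbt` no longer need to be separated by blank lines, so with a newer sbt the original file would parse as-is:

```scala
// build.sbt — valid in sbt 0.13.7+ without blank lines between settings
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
```

Also worth double-checking: the question mentions Spark 2.0, but the dependency pins `spark-core` 1.6.1; matching the dependency version to the installed Spark version avoids runtime incompatibilities.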