I installed Spark to C:\Spark1_6\spark-1.6.0-bin-hadoop2.6. After navigating to this path, I typed the sbt assembly command and received the following error message:
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':'
[error] Not a valid key: assembly
[error] assembly
[error] ^
Here is my sbt project structure:
-Project101
  -project
    -build.properties
    -plugins.sbt
  -src
  -build.sbt
Here is my build.sbt:
name := "Project101"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" % "spark-sql_2.10" % "1.6.0" exclude("org.apache.hadoop", "hadoop-yarn-server-web-proxy"),
  "org.apache.spark" %% "spark-hive" % "1.6.0",
  "org.apache.spark" %% "spark-streaming" % "1.6.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0"
)

resolvers in Global ++= Seq(
  "Sbt plugins" at "https://dl.bintray.com/sbt/sbt-plugin-releases",
  "Maven Central Server" at "http://repo1.maven.org/maven2",
  "TypeSafe Repository Releases" at "http://repo.typesafe.com/typesafe/releases/",
  "TypeSafe Repository Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"
)
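Note that there are no assembly-specific settings in this build.sbt. That is unrelated to the error above, but once sbt assembly does run, Spark builds like this one commonly fail on duplicate files under META-INF. A minimal merge-strategy sketch that could be appended, assuming the keys auto-imported by sbt-assembly 0.12.x:

// sketch only: resolve duplicate entries when building the uber-jar
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard // drop duplicate manifests/signatures
  case _                             => MergeStrategy.first   // otherwise keep the first copy found
}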
And here is plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")
The sbt package command runs fine and is able to create the jar file. But I also have to execute the sbt assembly command, and it does not work.
Answer 0 (score: 0)
Not a valid command: assembly
Whenever you face the error message, make sure you are in the top-level directory of the project in which the sbt-assembly plugin is installed.
If you have the project in the Project101 directory, make sure project/plugins.sbt contains the line:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")
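It is also worth confirming that project/build.properties pins an sbt 0.13.x release, since sbt-assembly 0.12.0 is an sbt 0.13 plugin and an older launcher would not load it at all. For example (the exact version below is only an assumption):

# project/build.properties (assumed content; any sbt 0.13.x release should do)
sbt.version=0.13.9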
With that in place, you should be back in the Project101 directory and execute sbt assembly. That should run the plugin and create an uber-jar.
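In other words, run sbt from the project root rather than from the Spark installation directory. A sketch of the command sequence, where the project's path is an assumption to be adjusted:

:: run from the directory that contains build.sbt and project/plugins.sbt,
:: not from C:\Spark1_6\spark-1.6.0-bin-hadoop2.6
cd C:\path\to\Project101
sbt assembly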