I have been using Apache Spark (Scala) and building the package with sbt. The package builds fine, but whenever I run

./bin/spark-submit \ "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help

I keep getting:

Exception in thread "main" java.net.URISyntaxException: Illegal character in path at index 0:

I don't understand why this happens.

Here is my code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

object creditFraud {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Transaction")
    val sc = new SparkContext(conf)
    val graph = GraphLoader.edgeListFile(sc, "Users/grantherman/Desktop/transactionFile.csv")
    println("GRAPHX: Number of vertices " + graph.vertices.count)
    println("GRAPHX: Number of edges " + graph.edges.count)
  }
}
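(A side note, separate from the submit error: GraphX's GraphLoader.edgeListFile expects each line to contain a whitespace-separated pair of vertex IDs, so a comma-separated .csv will not parse as-is. A minimal sketch of a converter — the CsvToEdgeList object and its file arguments are hypothetical, not part of the question:)

```scala
import scala.io.Source
import java.io.PrintWriter

// Hypothetical helper: GraphLoader.edgeListFile expects lines like
// "srcId dstId" (whitespace-separated, numeric IDs); this rewrites a
// comma-separated edge file into that format.
object CsvToEdgeList {
  // Keep only the first two comma-separated fields, joined by a space.
  def convertLine(line: String): String =
    line.split(",").map(_.trim).take(2).mkString(" ")

  def main(args: Array[String]): Unit = {
    val in  = Source.fromFile(args(0))   // e.g. transactionFile.csv
    val out = new PrintWriter(args(1))   // e.g. transactionFile.txt
    try in.getLines().foreach(l => out.println(convertLine(l)))
    finally { in.close(); out.close() }
  }
}
```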
Here is the .sbt file:
name := "Transaction"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.3.1" % "provided"
resolvers ++= Seq(
  "Akka Repository" at "http://repo.akka.io/releases/",
  "Spray Repository" at "http://repo.spray.cc/")
Answer 0 (score: 1)
If you are typing the command on a single line, remove the "\", so it looks like this:

./bin/spark-submit "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help

"\" is the Bash escape character. If you are typing a long command, you can split it across several lines, for example:
./bin/spark-submit \
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar" --help
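(This also explains the exact error message: with "\ " in the middle of the line, Bash treats the backslash as escaping the space, so the escaped space is glued onto the quoted path and spark-submit receives an argument that begins with a space character. Java's URI parser then rejects that leading space at index 0. A small sketch in plain Scala, no Spark needed, reproduces the parse failure:)

```scala
import java.net.{URI, URISyntaxException}

object UriDemo {
  def main(args: Array[String]): Unit = {
    // With `\ ` mid-line, the shell passes the escaped space through,
    // so the path argument starts with a literal space.
    val badArg = " /Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"
    try {
      new URI(badArg)
    } catch {
      case e: URISyntaxException =>
        // getIndex is 0: the leading space is the illegal character.
        println("Illegal character at index " + e.getIndex)
    }
  }
}
```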
Update: my previous answer only addressed the "java.net.URISyntaxException".

To run spark-submit, you can refer to its documentation: https://spark.apache.org/docs/1.1.0/submitting-applications.html

In your case, you can run your jar file with the following command (assuming your main class is org.apache.spark.examples.SparkPi):
./bin/spark-submit --class org.apache.spark.examples.SparkPi "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"
Or split it across multiple lines:
./bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"
You can also specify how many cores you want to run on (say, 4 cores):
./bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master local[4] \
"/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar"
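(An alternative sketch, not from the answer: assembling the argument list programmatically, e.g. from Scala, sidesteps shell quoting and line-continuation issues entirely, since each argument is its own string. The jar path is the asker's; the class name creditFraud comes from the question's code:)

```scala
// Sketch: build the spark-submit invocation as a list of arguments;
// no backslash continuation or quoting can go wrong this way.
object SubmitCommand {
  val cmd: Seq[String] = Seq(
    "./bin/spark-submit",
    "--class", "creditFraud",
    "--master", "local[4]",
    "/Users/Desktop/tranasactions/target/transaction_2.10-1.0.jar")

  def main(args: Array[String]): Unit = {
    // To actually launch (requires a Spark installation):
    // new ProcessBuilder(cmd: _*).inheritIO().start().waitFor()
    println(cmd.mkString(" "))
  }
}
```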
If you are not sure whether your jar file works, I suggest trying spark-examples-[version].jar before starting with your own.