build.sbt:
val sparkVersion = "2.1.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
Output:
[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found
Any ideas? I'm new to sbt and Spark. Thanks.
Answer (score: 1)
This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" having no Scala version suffix; see the Maven repo.
There are two solutions:
1. Set the Scala version explicitly in the artifact ID: "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2"
2. Use %% with the artifact ID: "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — this way sbt automatically appends your project's Scala binary version, expanding to solution 1.