sbt unresolved dependency for spark-cassandra-connector 2.0.2

Date: 2017-06-09 15:39:25

Tags: scala apache-spark sbt spark-cassandra-connector

build.sbt:

val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion

Output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I'm new to sbt and Spark. Thanks.

1 Answer:

Answer 0 (score: 1)

This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" lacking the Scala version suffix in the artifact ID; see the Maven repo:

http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

There are two solutions:

  1. Set the dependency explicitly with the Scala version in the artifact ID: "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2"
  2. Use %% with the artifact ID: "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2". This way, sbt automatically expands it to solution 1 based on your project's Scala version (see the sketch after this list).