I am trying to write a Spark application that connects to Cosmos DB using azure-cosmosdb-spark. However, even though I am using the matching versions of Spark and Scala, I keep running into dependency conflicts. I managed to use the connector on an Azure Databricks cluster running the same Spark version, so I am somewhat lost as to what the problem is here.
I have read these posts (Developing scala spark app that connect to azure CosmosDB & Spark libraries conflect when cosmosdb Lib) but still cannot resolve my issue.
Here is the relevant part of my SBT configuration for using the connector:
sparkVersion in ThisBuild := "2.2.0" // I also tried "2.2.1"
sparkComponents in ThisBuild += "mllib"
spIgnoreProvided in ThisBuild := true
scalaVersion in ThisBuild := "2.11.12"
parallelExecution in ThisBuild := false
scalacOptions in Compile ++= Seq("-implicits", "-feature")
lazy val root = (project in file("."))
.aggregate(shaker, ...)
.settings(Publish.notPublished: _*)
lazy val shaker = project
.settings(name := "project-name")
.settings(libraryDependencies += "com.github.pureconfig" %% "pureconfig" % "0.9.0")
.settings(libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.0")
.settings(libraryDependencies += "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.1")
.settings(scalacOptions += "-Xmacro-settings:materialize-derivations")
When I run SBT, I get the following error:
[error] (shaker/*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common
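From what I understand, this error means that artifacts built for two different Scala versions (e.g. both _2.10 and _2.11 suffixes) end up on the classpath, presumably pulled in transitively by the connector. One thing I have been wondering about (I am not sure it is the right approach) is excluding the Spark artifacts the connector brings in, so that only the provided Spark dependencies from sbt-spark-package remain, along the lines of:

```scala
// Speculative mitigation (untested): exclude the org.apache.spark artifacts
// that azure-cosmosdb-spark pulls in transitively, relying instead on the
// "provided" Spark dependencies declared via sbt-spark-package.
libraryDependencies += ("com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.1")
  .excludeAll(ExclusionRule(organization = "org.apache.spark"))
```

Would something like this be the correct way to resolve the suffix conflict, or is the problem elsewhere in my build?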
Thanks for your help!