Error: scalac: bad symbolic reference to org.apache.spark.Logging encountered in class file 'HBaseContext.class'

Date: 2017-06-18 10:57:09

Tags: spring scala apache-spark hbase

I am trying to use HBase with Spark 2.1.0 and Scala 2.11.2.

When I try to run the Spring application, I get the following error:

  Error: scalac: bad symbolic reference to org.apache.spark.Logging encountered in class file 'HBaseContext.class'. Cannot access type Logging in package org.apache.spark. The current classpath may be missing a definition for org.apache.spark.Logging, or HBaseContext.class may have been compiled against a version that is incompatible with the one found on the current classpath.
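
The message refers to org.apache.spark.Logging, which existed as a public trait only in Spark 1.x; in Spark 2.x it lives at org.apache.spark.internal.Logging and is private[spark]. As a minimal sketch of what a project-local stub for the missing definition could look like (an assumption about a possible workaround, not something I have verified here):

// Logging.scala -- sketch only: re-declares the removed public trait so that the
// symbolic reference inside HBaseContext.class can resolve against Spark 2.1.0.
package org.apache.spark

// Declaring the trait inside the org.apache.spark package grants access to the
// private[spark] internal.Logging trait that replaced the old public one.
trait Logging extends org.apache.spark.internal.Logging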

build.sbt

import sbt.ExclusionRule

name := "..."
version := "1.0"
scalaVersion := "2.11.2"

val elasticVersion = "5.4.1"

resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

/* Dependencies */
libraryDependencies ++= Seq(
  // Framework and configuration
  "org.springframework.boot" % "spring-boot-starter-web" % "1.5.3.RELEASE",
  "org.springframework.cloud" % "spring-cloud-starter-config" % "1.3.1.RELEASE",
  "org.hibernate" % "hibernate-validator" % "5.2.4.Final",

  /* Dependencies */
  "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7",

  // Spark
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0",
  "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11",

  //JDBC
  "mysql" % "mysql-connector-java" % "5.1.35",

  // Persistence
  "org.elasticsearch" % "elasticsearch-spark-20_2.11" % elasticVersion,
  "org.mongodb.spark" % "mongo-spark-connector_2.11" % "2.0.0",
  "org.apache.hbase" % "hbase-spark" % "2.0.0-alpha-1"


).map(_.excludeAll(ExclusionRule("org.slf4j", "slf4j-log4j12"), ExclusionRule("log4j", "log4j")))


/* Assembly     */
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs@_*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
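
Not part of the build above, but a hedged variant of the hbase-spark line that excludes any Spark artifacts it pulls in transitively, so that only the Spark 2.1.0 jars declared explicitly end up on the classpath (this rules out mixed Spark versions, though it does not by itself restore the removed Logging type):

// Sketch only: keep hbase-spark but drop its transitive org.apache.spark dependencies.
"org.apache.hbase" % "hbase-spark" % "2.0.0-alpha-1" excludeAll ExclusionRule(organization = "org.apache.spark")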

0 Answers:

There are no answers yet.