Resolving spark-avro error: Failed to load class for data source: com.databricks.spark.avro

Date: 2015-05-13 23:25:16

Tags: scala intellij-idea apache-spark avro

I am trying to use the spark-avro library to process Avro files. I am building with SBT:

build.sbt:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "1.3.0",
  "com.databricks" %% "spark-avro" % "1.0.0")

tester.scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import com.databricks.spark.avro._

object tester {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("SimpleApplication").setMaster("local")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Creates a DataFrame from the specified Avro file
    val df = sqlContext.load("episodes.avro", "com.databricks.spark.avro")
  }
}

When I run the test program in the IntelliJ IDE, I get the following stack trace:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/mapred/FsInput
    at com.databricks.spark.avro.AvroRelation.newReader(AvroRelation.scala:111)
    at com.databricks.spark.avro.AvroRelation.<init>(AvroRelation.scala:53)
    at com.databricks.spark.avro.DefaultSource.createRelation(DefaultSource.scala:41)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:290)

When I run:

$ sbt package
$ ~/spark-1.3.1/bin/spark-submit --class "tester" target/scala-2.10/project_2.10-0.1-SNAPSHOT.jar

I get the following stack trace:

Exception in thread "main" java.lang.RuntimeException: Failed to load class for data source: com.databricks.spark.avro
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:194)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:205)
    at org.apache.spark.sql.SQLContext.load(SQLContext.scala:697)

What can I do to resolve this error? Any help is greatly appreciated. Thanks!

2 Answers:

Answer 0 (score: 0)

"sbt package" does not bundle your dependencies into the jar, so spark-avro and its Avro classes are missing at runtime. Build a fat jar with sbt-assembly instead.
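A minimal sketch of enabling the plugin, assuming an sbt 0.13-era build; the sbt-assembly version shown is only an example and should be matched to your sbt release:

// project/assembly.sbt -- plugin version is an assumption, pick one compatible with your sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

With the plugin on the build classpath, running sbt assembly produces a single jar that contains your code together with its library dependencies, so spark-submit no longer needs to resolve spark-avro separately.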

Answer 1 (score: 0)

I changed my build.sbt file to:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "1.3.0",
  "com.databricks" %% "spark-avro" % "1.0.0",
  "org.apache.avro" % "avro" % "1.7.7",
  "org.apache.avro" % "avro-mapred" % "1.7.7")

assemblyMergeStrategy in assembly := {
  case PathList("org", "slf4j", xs@_*) => MergeStrategy.first
  case PathList("org", "apache", "spark", xs @_*) => MergeStrategy.first
  case PathList("com", "esotericsoftware", "minlog", xs @_*) => MergeStrategy.first
  case PathList("javax", "activation", xs @_*) => MergeStrategy.first
  case PathList("javax", "servlet", xs @_*) => MergeStrategy.first
  case PathList("javax", "xml", "stream", xs @_*) => MergeStrategy.first
  case PathList("org", "apache", "commons", xs @_*) => MergeStrategy.first
  case PathList("com", "google", "common", xs @_*) => MergeStrategy.first
  case "org/apache/hadoop/yarn/factories/package-info.class" => MergeStrategy.first
  case "org/apache/hadoop/yarn/factory/providers/package-info.class" => MergeStrategy.first
  case "org/apache/hadoop/yarn/util/package-info.class" => MergeStrategy.first
  case x if x.startsWith("META-INF") => MergeStrategy.discard
  case x if x.startsWith("plugin.properties") => MergeStrategy.discard
  case x => {
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
  }
}

and built the jar with the command

$ sbt assembly

Now everything works.
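For reference, submitting the assembled jar could then look like the sketch below; the jar file name is an assumption based on sbt-assembly's default <name>-assembly-<version>.jar naming and will depend on your project settings:

$ sbt assembly
$ ~/spark-1.3.1/bin/spark-submit --class "tester" target/scala-2.10/project-assembly-0.1-SNAPSHOT.jar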