Cannot run Spark app due to java.lang.NoClassDefFoundError: org/postgresql/Driver

Date: 2014-10-06 10:46:45

Tags: java postgresql apache-spark

I cannot run my Spark app because of java.lang.NoClassDefFoundError: org/postgresql/Driver.

I did the same thing as in How can I connect to a postgreSQL database into Apache Spark using scala?, but when I try to launch my app I get this exception.

Exception in thread "main" java.lang.NoClassDefFoundError: org/postgresql/Driver
    at SparkRecommendationMatrix.<init>(SparkRecommendationMatrix.scala:31)
    at Main$.main(Main.scala:26)
    at Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.postgresql.Driver
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 10 more

My build.sbt file:

name := "untitled12"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.postgresql" % "postgresql" % "9.2-1003-jdbc4",
  "org.apache.spark" % "spark-mllib_2.10" % "1.0.0"
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

And my Scala code:

val classes = Seq(
  getClass, // To get the jar with our own code.
  classOf[org.postgresql.Driver] // To get the connector.
)

val jars = classes.map(_.getProtectionDomain().getCodeSource().getLocation().getPath())

// set up environment
val conf = new SparkConf().setAppName(name).setJars(jars)
//.setMaster("spark://192.168.10.122:7077")
val sc = new SparkContext(conf)
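The NoClassDefFoundError means the driver class never reached the runtime classpath, even though the code compiled fine against the sbt dependency. A quick way to confirm this is to attempt the same lookup that failed in the stack trace above; this small diagnostic helper is a hypothetical sketch, not part of the original code:

```java
// Hypothetical diagnostic (not from the original post): checks whether a class
// is visible to the current classloader, which is exactly the lookup that
// produced the ClassNotFoundException above.
public class DriverCheck {
    static boolean classAvailable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Prints false when the postgresql jar is absent from the runtime classpath.
        System.out.println(classAvailable("org.postgresql.Driver"));
    }
}
```

If this prints false on the machine where the app runs, the problem is classpath delivery (the jar was never shipped to the driver/executors), not the sbt dependency declaration.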

1 Answer:

Answer 0 (score: 0)

I ran into a similar problem. It was solved by passing postgresql.jar as an argument to spark-submit:

spark-submit --class <<class>> --jars /full_path_to/postgresql.jar My_File.jar
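An alternative to listing the jar on every spark-submit invocation is to build a single fat jar that already bundles the PostgreSQL driver, using the sbt-assembly plugin. A minimal sketch (the plugin version and file layout are assumptions, not from the original post; the exact settings vary by plugin release):

```scala
// project/assembly.sbt — register the sbt-assembly plugin
// (version shown is an assumption; pick one compatible with your sbt release)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

Running `sbt assembly` then produces a jar that contains org.postgresql.Driver alongside your own classes, so spark-submit no longer needs the --jars flag for the driver.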