Scala - spark-corenlp - java.lang.ClassNotFoundException

Date: 2016-06-22 21:59:54

Tags: scala apache-spark stanford-nlp

I want to run the spark-coreNLP example, but I get a java.lang.ClassNotFoundException error when running spark-submit.

Here is the Scala code, taken from the GitHub example; I put it in an object and defined a SparkContext.

analyzer.Sentiment.scala:

package analyzer

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions._
import com.databricks.spark.corenlp.functions._

object Sentiment {
  def main(args: Array[String]) {

    val conf = new SparkConf().setAppName("Sentiment")
    val sc = new SparkContext(conf)
    // toDF comes from the implicits on a SQLContext instance, so the import
    // has to follow the instance definition; it cannot sit at the top of the file
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val input = Seq(
      (1, "<xml>Stanford University is located in California. It is a great university.</xml>")
    ).toDF("id", "text")

    val output = input
      .select(cleanxml('text).as('doc))
      .select(explode(ssplit('doc)).as('sen))
      .select('sen, tokenize('sen).as('words), ner('sen).as('nerTags), sentiment('sen).as('sentiment))

    output.show(truncate = false)
  }
}

I am using the build.sbt that ships with spark-coreNLP; I only modified scalaVersion and sparkVersion.

version := "1.0"

scalaVersion := "2.11.8"

initialize := {
  val _ = initialize.value
  val required = VersionNumber("1.8")
  val current = VersionNumber(sys.props("java.specification.version"))
  assert(VersionNumber.Strict.isCompatible(current, required), s"Java $required required.")
}

sparkVersion := "1.5.2"

// change the value below to change the directory where your zip artifact will be created
spDistDirectory := target.value

sparkComponents += "mllib"

spName := "databricks/spark-corenlp"

licenses := Seq("GPL-3.0" -> url("http://opensource.org/licenses/GPL-3.0"))

resolvers += Resolver.mavenLocal

libraryDependencies ++= Seq(
  "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" classifier "models",
  "com.google.protobuf" % "protobuf-java" % "2.6.1"
)

Then I build my jar, which runs without problems:

sbt package

Finally, I submit the job to Spark:

spark-submit --class "analyzer.Sentiment" --master local[4] target/scala-2.11/sentimentanalizer_2.11-0.1-SNAPSHOT.jar 

But I get the following error:

java.lang.ClassNotFoundException: analyzer.Sentiment
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:641)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My file Sentiment.scala is located in a package named "analyzer":

    $ find .
    ./src
    ./src/analyzer
    ./src/analyzer/Sentiment.scala
    ./src/com
    ./src/com/databricks
    ./src/com/databricks/spark
    ./src/com/databricks/spark/corenlp
    ./src/com/databricks/spark/corenlp/CoreNLP.scala
    ./src/com/databricks/spark/corenlp/functions.scala
    ./src/com/databricks/spark/corenlp/StanfordCoreNLPWrapper.scala
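
A quick way to check whether the class was packaged at all is to list the jar's contents (jar path taken from the spark-submit command above):

jar tf target/scala-2.11/sentimentanalizer_2.11-0.1-SNAPSHOT.jar | grep -i sentiment

If this prints nothing, analyzer/Sentiment.class never made it into the jar.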

When I ran the SimpleApp example from the Spark Quick Start, I noticed that MySimpleProject/bin/ contained a SimpleApp.class, while MySentimentProject/bin is empty. So I tried cleaning my project (I am using Eclipse for Scala).

I think this is because Sentiment.class needs to be generated, but I don't know how to do that: it happened automatically for SimpleApp.scala, and when I try to Run/Build it with Eclipse Scala, it crashes.

1 Answer:

Answer 0 (score: 1):

Maybe you should try adding

scalaSource in Compile := baseDirectory.value / "src"

to build.sbt, since the sbt documentation says: "the directory that contains the main Scala sources is by default src/main/scala".
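
For clarity, the setting sits alongside the other top-level settings; a minimal excerpt of how the question's build.sbt would then look (everything else unchanged):

scalaVersion := "2.11.8"

// tell sbt that sources live directly under ./src instead of ./src/main/scala
scalaSource in Compile := baseDirectory.value / "src"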

Or just put your sources in this structure:

$ find .
./src
./src/main
./src/main/scala
./src/main/scala/analyzer
./src/main/scala/analyzer/Sentiment.scala
./src/main/scala/com
./src/main/scala/com/databricks
./src/main/scala/com/databricks/spark
./src/main/scala/com/databricks/spark/corenlp
./src/main/scala/com/databricks/spark/corenlp/CoreNLP.scala
./src/main/scala/com/databricks/spark/corenlp/functions.scala
./src/main/scala/com/databricks/spark/corenlp/StanfordCoreNLPWrapper.scala
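
Either way, rebuilding and re-checking the jar should confirm the fix; this is a sketch assuming the jar name from the question:

sbt clean package
jar tf target/scala-2.11/sentimentanalizer_2.11-0.1-SNAPSHOT.jar | grep analyzer

You should now see analyzer/Sentiment.class listed, and the spark-submit command from the question should find the class. Note also that sbt package does not bundle dependencies such as stanford-corenlp into the jar, so at runtime they still need to be supplied, for example via spark-submit's --jars or --packages options, or via a fat jar built with sbt-assembly.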