Exception in thread "main" java.lang.ClassNotFoundException: org.spark_project.protobuf.GeneratedMessage

Asked: 2016-10-04 10:55:18

Tags: java eclipse maven apache-spark

When I try to run my Spark application, I get the exception below. I am using an Eclipse Maven project. Could someone suggest why this exception occurs? Is a dependency missing from my pom.xml, or is something else wrong? The IDE shows no errors in the code.

Exception in thread "main" java.lang.ClassNotFoundException: org.spark_project.protobuf.GeneratedMessage
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Unknown Source)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(ReflectiveDynamicAccess.scala:21)
        at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(ReflectiveDynamicAccess.scala:20)
        at scala.util.Try$.apply(Try.scala:191)
        at akka.actor.ReflectiveDynamicAccess.getClassFor(ReflectiveDynamicAccess.scala:20)
        at akka.serialization.Serialization$$anonfun$6.apply(Serialization.scala:265)
        at akka.serialization.Serialization$$anonfun$6.apply(Serialization.scala:264)
        at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:728)
        at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:727)
        at akka.serialization.Serialization.<init>(Serialization.scala:264)
        at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:15)
        at akka.serialization.SerializationExtension$.createExtension(SerializationExtension.scala:12)
        at akka.actor.ActorSystemImpl.registerExtension(ActorSystem.scala:745)
        at akka.actor.ExtensionId$class.apply(Extension.scala:79)
        at akka.serialization.SerializationExtension$.apply(SerializationExtension.scala:12)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:175)
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:656)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:653)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:653)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:669)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1832)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1823)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:270)
        at org.test.scala.test1.main(test1.java:13)

My code is as follows:

package org.test.scala;

import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;



public class test1 {
  public static void main(String[] args) {
    // SparkConf sparkConf = new SparkConf().setAppName("JavaSparkSQL").setMaster("local");
    SparkContext sc = new SparkContext(new SparkConf().setAppName("sql").setMaster("local"));
    SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
    DataFrame lines = sqlContext.jsonFile("sample.json");
    System.out.println(lines.toString());
    lines.registerTempTable("lines");
  }
}

Please offer some suggestions. Here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>org.test1</groupId>
  <artifactId>spark3</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>spark3</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.2</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.typesafe.akka</groupId>
      <artifactId>akka-actor_2.11</artifactId>
      <version>2.4.11</version>
    </dependency>
    <dependency>
      <groupId>org.spark-project.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.4.1-shaded</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>3.0.0-alpha1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <dependency>
      <groupId>jdk.tools</groupId>
      <artifactId>jdk.tools</artifactId>
      <scope>system</scope>
      <version>1.8</version>
      <systemPath>${env.JAVA_HOME}/lib/tools.jar</systemPath>
    </dependency>
  </dependencies>
</project>
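
For reference, the versions that Maven actually resolves from this POM (and any clash between the explicitly declared Akka/protobuf entries and the artifacts that spark-core pulls in transitively) can be listed with the standard dependency-plugin goal. This is only a diagnostic sketch, run from the project directory:

    mvn dependency:tree

The resolved versions of the akka and protobuf artifacts appear in the printed tree, which makes it easy to see what actually ends up on the runtime classpath.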

1 Answer:

Answer 0 (score: 0)

It sounds like you are either missing a library or have conflicting libraries. Show us your Maven or other library configuration.

GeneratedMessage is also a protobuf class, and it tends to be generated at compile time. Have you tried running "Generate Sources and Update Folders"?
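
For what it's worth, one possible direction (an assumption based on the conflict theory above, not a verified fix): Spark 1.3.0 already brings its own shaded Akka and protobuf artifacts in transitively, so explicitly declaring akka-actor 2.4.11 and org.spark-project.protobuf:protobuf-java:2.4.1-shaded can put a second, incompatible copy on the classpath. A minimal sketch of the dependency section with those explicit entries removed, relying on what spark-core_2.11 resolves on its own, might look like this:

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>1.3.0</version>
      <!-- assumed to pull in Spark's own (shaded) Akka and protobuf transitively -->
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- akka-actor, org.spark-project.protobuf and scala-library removed here,
         on the assumption that they conflict with the versions spark-core ships with -->
  </dependencies>

After changing the POM, re-running the Eclipse Maven project update (or mvn clean compile on the command line) rebuilds the classpath against the trimmed dependency set.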