How do I resolve a NoClassDefFoundError when using Hadoop?

Date: 2016-04-27 06:18:55

Tags: java hadoop noclassdeffounderror camus

I am getting

Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey

when running the command:

hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar 
com.linkedin.camus.etl.kafka.CamusJob -P camus.properties

I get the following exception:

2016-04-27 11:34:04.622 java[13567:351959] Unable to load realm mapping info from SCDynamicStore
[NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
    at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:252)
    at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:235)
    at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:691)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.linkedin.camus.etl.IEtlKey

I have included camus-example-0.1.0-SNAPSHOT-shaded.jar in the classpath.

Please let me know if I am missing something.

Thanks in advance,

Soumyajit

1 Answer:

Answer 0 (score: 0):

You should try adding camus-api from the LinkedIn's previous generation Kafka to HDFS pipeline page, since the missing class is included in that package, as you can see here.
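As a quick sanity check before changing the classpath, it can help to confirm which jar actually contains the missing class. A minimal sketch, assuming a locally built camus-api jar in the default Maven repository location (the path and version below are illustrative, not taken from the question):

# List the jar contents and look for the missing class; adjust the path/version to your build
$ jar tf ~/.m2/repository/com/linkedin/camus/camus-api/0.1.0-SNAPSHOT/camus-api-0.1.0-SNAPSHOT.jar | grep IEtlKey
# If the class is packaged there, this prints something like:
# com/linkedin/camus/etl/IEtlKey.class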

Also note any other transitive dependencies that Camus may need.
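One way to gather those transitive dependencies in a single place, assuming the job is built with Maven (target/dependency is just the maven-dependency-plugin default output directory), is sketched below; the collected jars can then be appended to the classpath or passed to -libjars as shown further down:

# From the Camus module root: copy all transitive dependencies into target/dependency
$ mvn dependency:copy-dependencies -DoutputDirectory=target/dependency

# Inspect what was collected before wiring it into the hadoop jar invocation
$ ls target/dependency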

In addition, to make sure the classes are found on the classpath when using the hadoop jar command line, you can add the -libjars command-line option, as described in Using the libjars option with Hadoop:

$ export LIBJARS=/path/jar1,/path/jar2
$ hadoop jar my-example.jar com.example.MyTool -libjars ${LIBJARS} -mytoolopt value
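Note that -libjars mainly makes the jars available to the map and reduce tasks. Since the NoClassDefFoundError above is thrown in the main thread (the job driver), the jars usually also need to be on the client-side classpath, which HADOOP_CLASSPATH controls. A hedged sketch reusing the same jar list (HADOOP_CLASSPATH is colon-separated, while -libjars stays comma-separated):

# Client-side classpath for the driver JVM (colon-separated)
$ export HADOOP_CLASSPATH=/path/jar1:/path/jar2
# -libjars (comma-separated) ships the same jars to the tasks
$ hadoop jar my-example.jar com.example.MyTool -libjars ${LIBJARS} -mytoolopt value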

It may also be useful to know that Camus is going to be replaced by Gobblin:

  

Camus is being phased out and replaced by Gobblin. For those using or interested in Camus, we suggest taking a look at Gobblin.

For instructions on migrating from Camus to Gobblin, please take a look at the Gobblin project.