I'm new to Java, and especially to object-oriented programming in Java, and I keep getting an error that is driving me crazy.

Here is part of my Employee superclass:
public Employee(String firstName, String lastName, String socialSecurityNumber) {
    this.firstName = firstName;
    this.lastName = lastName;
    this.socialSecurityNumber = socialSecurityNumber;
}

private final String firstName;
private final String lastName;
private final String socialSecurityNumber;

public String getFirstName() {
    return firstName;
}

public String getLastName() {
    return lastName;
}

public String getSocialSecurityNumber() {
    return socialSecurityNumber;
}

And the subclass CommissionEmployee:

public CommissionEmployee(String firstName, String lastName,
        String socialSecurityNumber, double grossSales,
        double commissionRate)
{
    this.firstName = firstName;
    this.lastName = lastName;
    this.socialSecurityNumber = socialSecurityNumber;
    this.grossSales = grossSales;
    this.commissionRate = commissionRate;
}

The error I keep getting says that the implicit super constructor Employee() is undefined.
What exactly does this mean, what causes it, and how do I fix it?
Answer 0 (score: 5)
Since CommissionEmployee is a subclass of Employee, Java requires that one of Employee's constructors be invoked whenever a CommissionEmployee is constructed. By default, this is a call to the no-argument constructor Employee(), which in your case does not exist.
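This chaining rule can be seen in a minimal, compilable sketch (the class names Base and Derived are illustrative, not from the question):

```java
// Java inserts an implicit super() call at the start of any constructor
// that does not chain explicitly. This example compiles only because Base
// has a no-argument constructor; remove Base() and Derived fails to compile.
class Base {
    Base() {
        System.out.println("Base()");
    }
}

class Derived extends Base {
    Derived() {
        // The implicit super() call runs here, before this body.
        System.out.println("Derived()");
    }
}

public class ChainingDemo {
    public static void main(String[] args) {
        new Derived(); // prints Base() then Derived()
    }
}
```

Running this prints Base() before Derived(), showing that the superclass constructor always completes before the subclass constructor body executes.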
You have two options:

Call the superclass constructor that already exists:
public CommissionEmployee(String firstName, String lastName,
        String socialSecurityNumber, double grossSales,
        double commissionRate)
{
    super(firstName, lastName, socialSecurityNumber);
    this.grossSales = grossSales;
    this.commissionRate = commissionRate;
}
Or create a no-argument superclass constructor and set the first name, last name, and SSN in the subclass constructor instead (less ideal — and note this only works if those fields are not declared final, since final fields must be assigned by the end of every constructor). The superclass (Employee) constructor can be declared protected so that it cannot be called from outside the class hierarchy, like this:
protected Employee() {}
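Putting the first option (explicit super(...) chaining) together, a self-contained sketch of both classes might look like the following. Note that the subclass no longer assigns firstName, lastName, or socialSecurityNumber directly — those fields are private to Employee and are set via the super call. The earnings() method is my own illustrative addition, not from the question:

```java
class Employee {
    private final String firstName;
    private final String lastName;
    private final String socialSecurityNumber;

    public Employee(String firstName, String lastName, String socialSecurityNumber) {
        this.firstName = firstName;
        this.lastName = lastName;
        this.socialSecurityNumber = socialSecurityNumber;
    }

    public String getFirstName() { return firstName; }
    public String getLastName() { return lastName; }
    public String getSocialSecurityNumber() { return socialSecurityNumber; }
}

class CommissionEmployee extends Employee {
    private final double grossSales;
    private final double commissionRate;

    public CommissionEmployee(String firstName, String lastName,
            String socialSecurityNumber, double grossSales,
            double commissionRate) {
        // Chain explicitly to the existing three-argument constructor.
        // Without this line the compiler inserts an implicit super() and
        // fails, because Employee has no no-argument constructor.
        super(firstName, lastName, socialSecurityNumber);
        this.grossSales = grossSales;
        this.commissionRate = commissionRate;
    }

    // Illustrative helper, not part of the original question.
    public double earnings() {
        return commissionRate * grossSales;
    }
}
```

With this in place, `new CommissionEmployee("Sue", "Jones", "222-22-2222", 10000, 0.06)` constructs without error, and the getters inherited from Employee work on the subclass instance.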