The table in my S3 bucket has 6 columns. When I run a simple SELECT on one column, it throws a NullPointerException.
Select user from network_log limit 5;
It gives the following error:
Error during job, obtaining debugging information...
Job Tracking URL: http://ip-10-0-10-16.ap-southeast-1.compute.internal:50030/jobdetails.jsp?jobid=job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000094 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000050 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000034 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000000 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000012 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000023 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000038 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000047 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000052 (and more) from job job_201407140633_22716
Examining task ID: task_201407140633_22716_m_000063 (and more) from job job_201407140633_22716
Task ID: task_201407140633_22716_m_000034
URL:
Diagnostic Messages for this Task:
java.lang.NullPointerException
at org.apache.hadoop.fs.s3native.NativeS3FileSystem$NativeS3FsInputStream.close(NativeS3FileSystem.java:147)
at java.io.BufferedInputStream.close(BufferedInputStream.java:451)
at java.io.FilterInputStream.close(FilterInputStream.java:155)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
at org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:254)
at org.apache.hadoop.hive.ql.io.RCFile$Reader.close(RCFile.java:1754)
at org.apache.hadoop.hive.ql.io.RCFileRecordReader.close(RCFileRecordReader.java:145)
at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doClose(CombineHiveRecordReader.java:72)
at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:96)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:344)
at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:251)
at org.apache.hadoop.mapred.MapTas
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
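
For reference, the failing table is an external table over the S3 data; judging from the RCFile$Reader and NativeS3FileSystem frames in the trace, it is stored as RCFile and read via the s3n filesystem. A rough sketch of the DDL is below. The column names (other than user), types, and the bucket path are placeholders for illustration, not the real definition:

-- Sketch of the table setup, assuming RCFile storage on s3n
-- (inferred from the stack trace); names/paths are illustrative.
CREATE EXTERNAL TABLE network_log (
  user   STRING,
  col2   STRING,
  col3   STRING,
  col4   STRING,
  col5   STRING,
  col6   STRING
)
STORED AS RCFILE
LOCATION 's3n://my-bucket/network_log/';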