The Hive shell throws a FileNotFoundException when executing queries, even though the jar file was added with "ADD JAR".

Asked: 2015-06-04 15:14:45

Tags: java hadoop hive hdfs hiveql

1) I added the SerDe jar file with "ADD JAR /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar;"

2) Created a table

3) The table was created successfully

4) But when I run any SELECT query, it throws a FileNotFoundException

hive> select count(*) from tab_tweets;

Query ID = hduser_20150604145353_51b4def4-11fb-4638-acac-77301c1c1806
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
java.io.FileNotFoundException: File does not exist: hdfs://node1:9000/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:428)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1638)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1397)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1183)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://node1:9000/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar)' FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
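The hdfs:// prefix in the error message is the key clue: the MapReduce job client resolved the scheme-less local jar path against fs.defaultFS, so it went looking for the jar inside HDFS instead of on the local disk. A minimal sketch of that resolution, using the values visible in the stack trace:

```shell
# A scheme-less path is joined to fs.defaultFS by the MapReduce job
# client, which is why a purely local jar path ends up as an hdfs:// URI.
DEFAULT_FS="hdfs://node1:9000"   # fs.defaultFS, as seen in the stack trace
JAR="/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar"
echo "${DEFAULT_FS}${JAR}"
# -> hdfs://node1:9000/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar
```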

3 answers:

Answer 0 (score: 2)

Method 1: copy hive-serdes-1.0-SNAPSHOT.jar from the local filesystem to HDFS:

hadoop fs -mkdir /home/hduser/softwares/hive/
hadoop fs -put /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar /home/hduser/softwares/hive/
  

Note: if you are using a recent Hadoop version, use hdfs dfs instead of hadoop fs.

Method 2: change the value of hive.aux.jars.path in hive-site.xml to:

<property>
 <name>hive.aux.jars.path</name>
 <value>file:///home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar</value>
</property>
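If more than one auxiliary jar is needed, hive.aux.jars.path accepts a comma-separated list of file:// URIs. A hypothetical sketch (the second jar name is an invented example):

```xml
<property>
 <name>hive.aux.jars.path</name>
 <value>file:///home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar,file:///home/hduser/softwares/hive/my-other-udfs.jar</value>
</property>
```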

Method 3: add hive-serdes-1.0-SNAPSHOT.jar to the Hadoop classpath, i.e. add this line to hadoop-env.sh:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar
  

Note: the paths above assume Hive is installed at /home/hduser/softwares/hive. If you installed Hive elsewhere, change /home/hduser/softwares/hive to point to your Hive installation folder.
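Method 3 can be sanity-checked in a shell without restarting anything, by extending the variable the way hadoop-env.sh would and then searching the classpath for the jar (the jar path is the one from the question):

```shell
# Extend HADOOP_CLASSPATH as hadoop-env.sh would, then confirm the jar
# is present by splitting the classpath on ':' and searching for it.
export HADOOP_CLASSPATH="${HADOOP_CLASSPATH}:/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar"
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep 'hive-serdes'
```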

Answer 1 (score: 0)

Check that the jar actually exists at /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar.

Answer 2 (score: 0)

Note: there is no need to copy hive-serdes-1.0-SNAPSHOT.jar into HDFS; keep it on the local filesystem.

At query execution time, Hive takes care of making it available on all nodes via the distributed cache.

For more details, refer to this link: official link

FYI: see Hive Resources

Once a resource is added to a session, Hive queries can refer to it by its name (in map/reduce/transform clauses), and the resource is available locally at execution time on the entire Hadoop cluster. Hive uses Hadoop's Distributed Cache to distribute the added resources to all machines in the cluster at query execution time.

You can add additional jars in several ways:

  • In the current Hive session:

hive> add jar /local/fs/path/to/your/file.jar;

hive> list jars;  -- to verify

  • In .hiverc on the node where you run Hive (it works like .bashrc):

    cd $HOME

    create a file named .hiverc if it does not exist

    cat $HOME/.hiverc

    add jar /local/fs/path/to/your/file.jar  -- add this line to the file

  • In hive-site.xml:

<property>
 <name>hive.aux.jars.path</name>
 <value>file:///home/user/path/to/your/hive-serdes-1.0-SNAPSHOT.jar</value>
</property>
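The .hiverc step in the second bullet can be scripted; a sketch, assuming the placeholder jar path from above:

```shell
# Idempotently append an ADD JAR line to ~/.hiverc; the Hive CLI reads
# this file at startup, so every new session picks up the jar.
HIVERC="$HOME/.hiverc"
LINE='add jar /local/fs/path/to/your/file.jar;'
touch "$HIVERC"
grep -qxF "$LINE" "$HIVERC" || echo "$LINE" >> "$HIVERC"
cat "$HIVERC"
```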