Strange error when trying to write into HDFS

Date: 2015-04-08 10:00:29

Tags: hdfs cloudera-cdh

We are hitting a strange, intermittent error when trying to write into HDFS. This is the exception:

http://pastebin.com/3YDX4a39

This is what the code looks like:

http://pastebin.com/h1RW07qv

The exception occurs the first time the fs field variable is instantiated (line 12). When I call the method MyWatchService.saveInputDataIntoHDFS, the first thing it does is run the static initializer of the MyHadoopUtils class:

fs = FileSystem.get(myConf);

That call throws the exception, yet within the exception output I can see this message:

[INFO][FeedAdapter][2015-04-08 09:31:21] MyHadoopUtils:29 - HDFS instantiated! name: hdfs://dub-vcd-vms170.global.tektronix.net:8020
[INFO][FeedAdapter][2015-04-08 09:31:21] MyHadoopUtils:43 - HDFS fs instantiated? true
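The pastebin links above have since expired, so here is a self-contained sketch of the static-initializer pattern the question describes. The class and field names (MyHadoopUtils, fs) come from the question and its log output; the Hadoop FileSystem handle is replaced by a plain string so the sketch runs without Hadoop on the classpath. With the real API, the marked line would be `fs = FileSystem.get(myConf);`.

```java
// Sketch of the static-initializer pattern from the question.
// MyHadoopUtils/fs are names taken from the question; the resource
// is a plain string here so the example needs no Hadoop jars.
class MyHadoopUtils {
    static final String fs;

    static {
        // Real class would do: fs = FileSystem.get(myConf);
        // Note: any exception thrown inside a static initializer
        // surfaces at the caller as an ExceptionInInitializerError
        // the first time the class is touched.
        fs = "hdfs://example:8020";
        System.out.println("HDFS instantiated! name: " + fs);
    }
}

public class Main {
    public static void main(String[] args) {
        // First access triggers the static block above.
        System.out.println("HDFS fs instantiated? " + (MyHadoopUtils.fs != null));
    }
}
```

This mirrors the two log lines shown above: the static block prints once on first use, and the second line confirms the handle is non-null even though a later operation on it can still fail.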

How can I get rid of the IOException?

I am running this in a Linux environment:

    2.6.32-504.3.3.el6.centos.plus.x86_64, java version "1.7.0_71"
    OpenJDK Runtime Environment (rhel-2.5.3.2.el6_6-x86_64 u71-b14)
    OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

    <hadoop-hdfs.version>2.5.0-cdh5.2.0</hadoop-hdfs.version>
    <hadoop-common.version>2.5.0-cdh5.2.0</hadoop-common.version>

    <!-- necessary to write within HDFS -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop-hdfs.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop-common.version}</version>
    </dependency>

1 Answer

Answer 0 (score: 0)

Answering my own question: the problem went away once I set the HADOOP_HOME variable:

export HADOOP_HOME=/var/lib/hadoop-hdfs
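To make the fix persist across sessions, the export can go in the profile of the user that runs the service. A minimal sketch (the path is the one from this answer; adjust it if your CDH packages install the Hadoop client elsewhere):

```shell
# Path taken from the answer above; verify it matches your install.
export HADOOP_HOME=/var/lib/hadoop-hdfs
export PATH="$HADOOP_HOME/bin:$PATH"

# Confirm the variable is visible before launching the JVM:
echo "HADOOP_HOME=$HADOOP_HOME"
```

Since the JVM inherits its environment from the launching shell, the variable must be set (or sourced) in whatever wrapper or init script actually starts the application, not just in an interactive login shell.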