FileUtil.copy to HDFS in HA mode

Date: 2015-08-13 11:16:19

Tags: hadoop

I am copying a file from the local filesystem to HDFS in HA mode, but it throws the following exception:

java.net.UnknownHostException: unknown host: nameservices
at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:850)
at org.apache.hadoop.ipc.Client.call(Client.java:720)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at com.sun.proxy.$Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)

My HDFS write code is:

    // Source: local file resolved from the section descriptor
    Path srcPath = new Path(LookupFileHelper.getLocalFileLocation(section));
    LOGGER.info("Uploading srcPath " + srcPath.toUri());
    FileSystem srcFS = FileSystem.get(srcPath.toUri(), conf);
    LOGGER.info("Uploading srcFS name " + srcFS.getName());
    LOGGER.info("Uploading srcFS " + srcFS);

    // Destination: HDFS path, e.g. hdfs://nameservices/...
    Path dstPath = new Path(section.getDestLocationHdfs());
    LOGGER.info("Uploading dstPath " + section.getDestLocationHdfs());
    URI uri = URI.create(section.getDestLocationHdfs());
    LOGGER.info("Uploading dstPath " + uri);
    FileSystem dstFS = FileSystem.get(uri, conf);
    LOGGER.info("Uploading dstFS " + dstFS);

    FileUtil.copy(srcFS, srcPath, dstFS, dstPath, false, conf);
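The `UnknownHostException: unknown host: nameservices` suggests the client `Configuration` is treating the logical nameservice name as a real hostname, which happens when the HA properties that map the nameservice to concrete NameNode addresses are not on the client's classpath (or not set on `conf`). A minimal sketch of the usual `hdfs-site.xml` entries follows; the NameNode ids and hostnames here are assumptions, only the nameservice name `nameservices` is taken from the exception:

```xml
<configuration>
  <!-- Logical nameservice; must match the authority in hdfs://nameservices/... -->
  <property>
    <name>dfs.nameservices</name>
    <value>nameservices</value>
  </property>
  <!-- NameNode ids behind the nameservice (ids are assumptions) -->
  <property>
    <name>dfs.ha.namenodes.nameservices</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC addresses of the two NameNodes (hostnames are assumptions) -->
  <property>
    <name>dfs.namenode.rpc-address.nameservices.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.nameservices.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <!-- Proxy provider that lets the client resolve the logical name and fail over -->
  <property>
    <name>dfs.client.failover.proxy.provider.nameservices</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>
```

The same properties can be set programmatically with `conf.set(...)` before calling `FileSystem.get(uri, conf)` if shipping an `hdfs-site.xml` with the client is not an option.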

0 Answers:

No answers yet