How to upload a file to Hadoop (HDFS) using the Java API

Date: 2018-06-18 14:00:00

Tags: java hadoop hadoop2

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class HdfsCon
{
    public static void main(String[] args) throws Exception
    {
        String hdfsUrl = "hdfs://172.16.32.139:9000";
        Configuration conf = new Configuration();
        //conf.set("fs.defaultFS", hdfsUrl);  // fs.default.name is the deprecated Hadoop 1.x key

        // Connect to the NameNode and obtain a FileSystem handle
        FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);

        String src = "/home/user1/Documents/hive-site.xml"; // local source file
        String dst = "/fifo_tbl";                           // HDFS destination path

        fs.copyFromLocalFile(new Path(src), new Path(dst));
        fs.close();
    }
}

When I try to run the Java program above, it gives me the following error. I am trying to upload a file to Hadoop through Java code. (Hadoop version 2.9.0)

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.ipc.Client.call(Client.java:1113)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
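
This error is a client/server protocol mismatch, not a bug in the upload code: "Server IPC version 9" is the RPC protocol spoken by a Hadoop 2.x NameNode, while "client version 4" is the protocol used by the old `hadoop-core` 1.x jars. The usual cause is compiling and running against a Hadoop 1.x jar on the classpath while the cluster runs 2.9.0. A sketch of a Maven dependency that matches the cluster version (assuming a Maven build; adjust the version to whatever your cluster actually runs):

```xml
<!-- Replace any hadoop-core 1.x dependency with a client artifact
     matching the cluster version (2.9.0 here, to match the server) -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.9.0</version>
</dependency>
```

After changing the dependency, rebuild so that no stale 1.x Hadoop jar remains on the runtime classpath.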

0 Answers:

No answers yet