How can I do the equivalent of hadoop put, i.e. upload a file into HDFS, from Java? Is this possible?
Thanks!
Answer 0 (score: 2)
Try this:
//Source file in the local file system
String localSrc = args[0];
//Destination file in HDFS
String dst = args[1];
//Input stream for the local file to be written to HDFS
InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
//Get configuration of Hadoop system
Configuration conf = new Configuration();
System.out.println("Connecting to -- "+conf.get("fs.defaultFS"));
//Get a handle to the file system that the destination path lives on (HDFS)
FileSystem fs = FileSystem.get(URI.create(dst), conf);
OutputStream out = fs.create(new Path(dst));
//Copy file from local to HDFS
IOUtils.copyBytes(in, out, 4096, true);
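For context, a self-contained version of the snippet above might look roughly like this. The class name HdfsPut and the argument layout are illustrative assumptions; it expects the Hadoop client libraries on the classpath and fs.defaultFS configured for your cluster:
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsPut {
    public static void main(String[] args) throws Exception {
        // args[0]: source file on the local file system
        // args[1]: destination path in HDFS, e.g. hdfs://namenode:8020/user/you/file
        String localSrc = args[0];
        String dst = args[1];

        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));

        // Loads core-site.xml / hdfs-site.xml from the classpath
        Configuration conf = new Configuration();
        System.out.println("Connecting to -- " + conf.get("fs.defaultFS"));

        // Resolve the file system from the destination URI and stream the bytes across
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst));
        IOUtils.copyBytes(in, out, 4096, true); // closes both streams when done
    }
}
Packaged into a jar, it could then be launched with something like hadoop jar your-app.jar HdfsPut /tmp/data.txt /user/you/data.txt, where the jar name and paths are only placeholders.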
Answer 1 (score: 1)
You should be able to use copyFromLocalFile:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
Path localPath = new Path("path/to/local/file");
Path hdfsPath = new Path("/path/in/hdfs");
// Programmatic equivalent of "hadoop fs -put"
fs.copyFromLocalFile(localPath, hdfsPath);
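If you need control over what happens to the source file or an existing destination, FileSystem also has a copyFromLocalFile overload that takes delSrc and overwrite flags. A small sketch, with the flag values chosen purely for illustration:
// Keep the local file (delSrc = false) and overwrite the HDFS destination if it already exists
fs.copyFromLocalFile(false, true, localPath, hdfsPath);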