Java: Download .zip files from FTP and extract the contents without saving the files on the local system

Time: 2016-02-29 07:34:37

Tags: java hadoop ftp zip

I have a requirement where I need to download certain .zip files from an FTP server and push the contents of the archives (the contents are XML files) to HDFS (Hadoop Distributed File System). At the moment I am using the Apache Commons Net FTPClient to connect to the FTP server and download the files to my local machine first. Later I unzip them and pass the folder path to a method that iterates over the local folder and pushes the files to HDFS. I have attached some code snippets below for better understanding.

    //Gives me an active FTPClient
    FTPClient ftpCilent = getActiveFTPConnection();
    ftpCilent.changeWorkingDirectory(remoteDirectory);

    FTPFile[] ftpFiles = ftpCilent.listFiles();
    if(ftpFiles.length <= 0){
        logger.info("Unable to find any files in given location!!");
        return;
    }

    //Iterate files
    for(FTPFile eachFTPFile : ftpFiles){
        String ftpFileName = eachFTPFile.getName();

        //Skip files that are not .zip files
        if(!ftpFileName.endsWith(".zip")){
            continue;
        }

        System.out.println("Reading File -->" + ftpFileName);

        /*
         * location is the path on the local system given by the user,
         * usually loaded from a property file.
         *
         * archiveFileLocation is where the archive files downloaded
         * from FTP are stored.
         */
        String archiveFileLocation = location + File.separator + ftpFileName;
        String localDirName = ftpFileName.replaceAll(".zip", "");

        /*
         * localDirLocation is the location where a folder is created,
         * named after the archive on the FTP server, and the extracted
         * files are copied into it.
         */
        String localDirLocation = location + File.separator + localDirName;
        File localDir = new File(localDirLocation);
        localDir.mkdir();

        File archiveFile = new File(archiveFileLocation);

        FileOutputStream archiveFileOutputStream = new FileOutputStream(archiveFile);
        ftpCilent.retrieveFile(ftpFileName, archiveFileOutputStream);
        archiveFileOutputStream.close();

        //Delete the archive file after copying its contents
        FileUtils.forceDeleteOnExit(archiveFile);

        //Read the archive file from archiveFileLocation.
        ZipFile zip = new ZipFile(archiveFileLocation);
        Enumeration entries = zip.entries();

        while(entries.hasMoreElements()){
            ZipEntry entry = (ZipEntry) entries.nextElement();

            if(entry.isDirectory()){
                logger.info("Extracting directory " + entry.getName());
                (new File(entry.getName())).mkdir();
                continue;
            }

            logger.info("Extracting File: " + entry.getName());
            IOUtils.copy(zip.getInputStream(entry), new FileOutputStream(
                    localDir.getAbsolutePath() + File.separator + entry.getName()));
        }

        zip.close();

        /*
         * Iterate the folder location provided and load the files to HDFS.
         */
        loadFilesToHDFS(localDirLocation);
    }
    disconnectFTP();
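
For context, loadFilesToHDFS() is not shown in the question. A minimal sketch of what such a method might look like, assuming the standard Hadoop FileSystem API and a hypothetical HDFS target directory (/data/xml is an assumption, not from the original code):

    import java.io.File;
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Hypothetical sketch of loadFilesToHDFS(); the real method is not shown in the question.
    // Assumes the Hadoop configuration (fs.defaultFS) is available on the classpath.
    private void loadFilesToHDFS(String localDirLocation) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path hdfsTargetDir = new Path("/data/xml");   // assumed target directory

        File localDir = new File(localDirLocation);
        File[] files = localDir.listFiles();
        if (files == null) {
            return;
        }
        for (File file : files) {
            if (file.isFile()) {
                // copyFromLocalFile reads the local file and writes it into HDFS
                fs.copyFromLocalFile(new Path(file.getAbsolutePath()),
                        new Path(hdfsTargetDir, file.getName()));
            }
        }
    }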

Now, the problem with this approach is that the application spends a lot of time downloading the files to a local path, unzipping them, and then loading them into HDFS. Is there a better way to extract the contents of the zip on the fly from the FTP stream and feed the content streams directly to the loadFilesToHDFS() method instead of passing a path on the local system?

1 Answer:

Answer 0 (score: 0)

Use zip streams. Look here: http://www.oracle.com/technetwork/articles/java/compress-1565076.html

See the code examples there in particular.
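
A minimal sketch of how the question's loop could apply this, wrapping the FTP data stream in a java.util.zip.ZipInputStream and writing each entry straight to HDFS without a local temp file. The Hadoop FileSystem handle (hdfs) and target directory (hdfsTargetDir) are assumptions, not part of the original code:

    import java.io.InputStream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    import org.apache.commons.io.IOUtils;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.Path;

    // Sketch only: hdfs, hdfsTargetDir and ftpCilent are assumed to be set up elsewhere.
    // retrieveFileStream() returns the raw FTP data stream; wrapping it in a ZipInputStream
    // lets each entry be read and written to HDFS on the fly.
    InputStream ftpIn = ftpCilent.retrieveFileStream(ftpFileName);
    ZipInputStream zipIn = new ZipInputStream(ftpIn);

    ZipEntry entry;
    while ((entry = zipIn.getNextEntry()) != null) {
        if (entry.isDirectory()) {
            zipIn.closeEntry();
            continue;
        }
        // Create the HDFS file and copy the current entry's bytes into it.
        Path target = new Path(hdfsTargetDir, entry.getName());
        FSDataOutputStream out = hdfs.create(target);
        IOUtils.copy(zipIn, out);   // reads until the end of the current entry
        out.close();
        zipIn.closeEntry();
    }
    zipIn.close();

    // Required by commons-net after consuming a stream returned by retrieveFileStream().
    ftpCilent.completePendingCommand();

With this shape, loadFilesToHDFS(localDirLocation) would no longer be needed in that form; the per-entry write to HDFS replaces both the local extraction and the later upload.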