Google Dataflow: Coder for ZipInputStream

Date: 2016-10-28 09:57:55

Tags: google-cloud-dataflow

I am trying to unzip some files (which are themselves zip archives) from Google Cloud Storage into Google Cloud Storage.

To do that, I have the following DoFn to collect the ZipInputStreams:

  static class UnzipFilesFN extends DoFn<GcsPath, ZipInputStream> {

      private static final long serialVersionUID = 7373250969860890761L;

      @Override
      public void processElement(ProcessContext c) {
          GcsPath p = c.element();
          try {
              ZipInputStream zis = new ZipInputStream(new FileInputStream(p.toString()));
              c.output(zis);
          }
          catch (FileNotFoundException fnfe) {
              // ignored: elements whose file cannot be opened are silently dropped
          }
      }
  }

The unzipping and writing is handled by the following custom sink:

public static class ZipIO{    
  public static class Sink extends com.google.cloud.dataflow.sdk.io.Sink<ZipInputStream> {

    private static final long serialVersionUID = -7414200726778377175L;
    final String unzipTarget;

      public Sink withDestinationPath(String s){
         if(!s.isEmpty()){ // compare string content; != only compares references
             return new Sink(s);
         }
         else {
             throw new IllegalArgumentException("must assign destination path");
         }
      }

      protected Sink(String path){
          this.unzipTarget = path;
      }

      @Override
      public void validate(PipelineOptions po){
          if(unzipTarget==null){
              throw new RuntimeException("destination path must be set via withDestinationPath()");
          }
      }

      @Override
      public ZipFileWriteOperation createWriteOperation(PipelineOptions po){
          return new ZipFileWriteOperation(this);
      }

  }

  private static class ZipFileWriteOperation extends WriteOperation<ZipInputStream, UnzipResult>{

    private static final long serialVersionUID = 7976541367499831605L;
    private final ZipIO.Sink sink;

      public ZipFileWriteOperation(ZipIO.Sink sink){
          this.sink = sink;
      }



      @Override
      public void initialize(PipelineOptions po) throws Exception{

      }

      @Override
      public void finalize(Iterable<UnzipResult> writerResults, PipelineOptions po) throws Exception {
         long totalFiles = 0;
         for(UnzipResult r:writerResults){
             totalFiles +=r.filesUnziped;
         }
         LOG.info("Unzipped {} Files",totalFiles);
      }  

      @Override
      public ZipIO.Sink getSink(){
          return sink;
      }

      @Override
      public ZipWriter createWriter(PipelineOptions po) throws Exception{
          return new ZipWriter(this);
      }

  }

  private static class ZipWriter extends Writer<ZipInputStream, UnzipResult>{
      private final ZipFileWriteOperation writeOp;
      private long totalUnzipped = 0;

      ZipWriter(ZipFileWriteOperation writeOp){
          this.writeOp = writeOp;
      }

      @Override
      public void open(String uID) throws Exception{
      }

      @Override
      public void write(ZipInputStream zis){
            byte[] buffer = new byte[1024];
            try{
                ZipEntry ze = zis.getNextEntry();
                while(ze!=null){
                    File f = new File(writeOp.sink.unzipTarget + "/" + ze.getName());
                    FileOutputStream fos = new FileOutputStream(f);
                    int len;
                    while((len=zis.read(buffer))>0){
                        fos.write(buffer, 0, len);
                    }
                    fos.close();
                    this.totalUnzipped++;
                    zis.closeEntry();
                    ze = zis.getNextEntry(); // advance to the next entry, otherwise this loop never terminates
                }
                zis.close();
            }
            catch(Exception e){
                // ignored: extraction stops silently on the first failing entry
            }
      }

      @Override
      public UnzipResult close() throws Exception{
          return new UnzipResult(this.totalUnzipped);
      }

      @Override
      public ZipFileWriteOperation getWriteOperation(){
          return writeOp;
      }


  }

  private static class UnzipResult implements Serializable{  
    private static final long serialVersionUID = -8504626439217544799L;
    final long filesUnziped;      
      public UnzipResult(long filesUnziped){
          this.filesUnziped=filesUnziped;
      }
  }
}


When I try to run the pipeline, I get the following error:

  

  Building a Coder from the fallback CoderProvider failed: Unable to provide a Coder for type java.util.zip.ZipInputStream:
  com.google.cloud.dataflow.sdk.coders.protobuf.ProtoCoder$1@5717c37 could not provide a Coder for type java.util.zip.ZipInputStream: Cannot provide ProtoCoder because java.util.zip.ZipInputStream is not a subclass of com.google.protobuf.Message;
  com.google.cloud.dataflow.sdk.coders.SerializableCoder$1@68f4865 could not provide a Coder for type java.util.zip.ZipInputStream: Cannot provide SerializableCoder because java.util.zip.ZipInputStream does not implement Serializable.
      at com.google.cloud.dataflow.sdk.values.TypedPValue.inferCoderOrFail(TypedPValue.java:195)
      at com.google.cloud.dataflow.sdk.values.TypedPValue.getCoder(TypedPValue.java:48)
      at com.google.cloud.dataflow.sdk.values.PCollection.getCoder(PCollection.java:137)
      at com.google.cloud.dataflow.sdk.values.TypedPValue.finishSpecifying(TypedPValue.java:88)
      at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:332)
      at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:291)
      at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:174)

Which coder do I need to assign to handle ZipInputStreams?

Thanks & BR, Philipp

1 answer:

Answer 0 (score: 0):

A coder is necessary so that a runner can materialize a PCollection to temporary storage and read it back, rather than keeping it in memory. I can't think of a reasonable way to materialize a ZipInputStream object - this is a fundamental conceptual problem rather than a Coder API problem.
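
(For context: when the element type actually is encodable - for example, it implements Serializable - you would normally just attach a coder to the PCollection. A minimal sketch against the Dataflow 1.x SDK, where MyRecord and ParseRecordFn are hypothetical placeholder names:)

  // Sketch only; MyRecord and ParseRecordFn are hypothetical.
  PCollection<MyRecord> records = pipeline
      .apply(TextIO.Read.from("gs://my-bucket/input.txt"))   // placeholder input
      .apply(ParDo.of(new ParseRecordFn()))                   // hypothetical DoFn producing MyRecord
      .setCoder(SerializableCoder.of(MyRecord.class));        // works because MyRecord implements Serializable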

However, in your particular case, I think you can simply open the ZipInputStream inside the ZipWriter.write() function, and make ZipIO.Sink a Sink<GcsPath> rather than a Sink<ZipInputStream>.
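
A rough sketch of what write() could look like after that change (assuming the Sink, WriteOperation and Writer are re-declared with GcsPath as the element type; the local FileInputStream/FileOutputStream calls are kept here only to mirror the original code):

      // Sketch: ZipWriter as a Writer<GcsPath, UnzipResult> that opens the stream itself.
      @Override
      public void write(GcsPath p) throws Exception {
          byte[] buffer = new byte[1024];
          // The stream is opened here, on the worker doing the writing, so only the
          // (easily encodable) GcsPath travels through the PCollection.
          try (ZipInputStream zis = new ZipInputStream(new FileInputStream(p.toString()))) {
              ZipEntry ze = zis.getNextEntry();
              while (ze != null) {
                  File f = new File(writeOp.sink.unzipTarget + "/" + ze.getName());
                  try (FileOutputStream fos = new FileOutputStream(f)) {
                      int len;
                      while ((len = zis.read(buffer)) > 0) {
                          fos.write(buffer, 0, len);
                      }
                  }
                  this.totalUnzipped++;
                  zis.closeEntry();
                  ze = zis.getNextEntry();
              }
          }
      }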

One more thing I noticed in your code: I assume you are planning to use this with files located on GCS and with the Cloud Dataflow runner, rather than only with the in-memory runner and local files. In that case, plain FileInputStream/FileOutputStream will not transparently handle reads and writes against GCS - you will need to use GcsUtil.
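
A minimal sketch of how that could look, assuming the Dataflow 1.x SDK's GcsOptions/GcsUtil API (java.nio.channels.Channels plus com.google.cloud.dataflow.sdk.util.GcsUtil, options.GcsOptions and util.gcsfs.GcsPath); the URIs are placeholders:

  // Sketch only: obtain GcsUtil from the pipeline options and use it for GCS I/O.
  static ZipInputStream openZipFromGcs(PipelineOptions options, String uri) throws IOException {
      GcsUtil gcsUtil = options.as(GcsOptions.class).getGcsUtil();
      SeekableByteChannel channel = gcsUtil.open(GcsPath.fromUri(uri));   // e.g. "gs://my-bucket/archive.zip"
      return new ZipInputStream(Channels.newInputStream(channel));
  }

  // And for writing an extracted entry back to GCS instead of a local FileOutputStream:
  static OutputStream createGcsOutput(PipelineOptions options, String uri) throws IOException {
      GcsUtil gcsUtil = options.as(GcsOptions.class).getGcsUtil();
      WritableByteChannel channel = gcsUtil.create(GcsPath.fromUri(uri), "application/octet-stream");
      return Channels.newOutputStream(channel);
  }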