I'm trying to use a custom Coder so that I can do some transforms, but I can't get the PCollection to actually use my custom coder, and I suspect this is because it is wrapped in a KV. Specifically:
Pipeline p = Pipeline.create ...
p.getCoderRegistry().registerCoder(MyClass.class, MyClassCoder.class);
...
PCollection<String> input = ...
PCollection<KV<String, MyClass>> t = input.apply(new ToKVTransform());
When I try to run something like this, I get a java.lang.ClassCastException and a stack trace that includes SerializableCoder rather than the MyClassCoder I would have expected:
[error] at com.google.cloud.dataflow.sdk.coders.SerializableCoder.decode(SerializableCoder.java:133)
[error] at com.google.cloud.dataflow.sdk.coders.SerializableCoder.decode(SerializableCoder.java:50)
[error] at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:95)
[error] at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:42)
I see that the answer to another somewhat related question (Using TextIO.Write with a complicated PCollection type in Google Cloud Dataflow) was to map everything to strings and use those to pass data around in PCollections. Is that really the recommended way?
(Note: the actual code is in Scala, but I'm fairly sure this isn't a Scala <=> Java issue, so I've translated it into Java here.)
Update with the Scala code and more background:

Here is the actual exception itself (I should have included this from the start):
java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field com.example.schema.Schema.keyTypes of type scala.collection.immutable.Map in instance of com.example.schema.Schema
where com.example.schema.Schema is:
case class Schema(id: String, keyTypes: Map[String, Type])
And finally, SchemaCoder is:
class SchemaCoder extends com.google.cloud.dataflow.sdk.coders.CustomCoder[Schema] {
  def decode(inputStream: InputStream, context: Context): Schema = {
    val ois = new ObjectInputStream(inputStream)
    val id: String = ois.readObject().asInstanceOf[String]
    val javaMap: java.util.Map[String, Type] = ois.readObject().asInstanceOf[java.util.Map[String, Type]]
    ois.close()
    Schema(id, javaMap.asScala.toMap)
  }

  def encode(schema: Schema, outputStream: OutputStream, context: Context): Unit = {
    val baos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(baos)
    oos.writeObject(schema.id)
    val javaMap: java.util.Map[String, Type] = schema.keyTypes.asJava
    oos.writeObject(javaMap)
    oos.close()
    val encoded = new String(Base64.encodeBase64(baos.toByteArray()))
    outputStream.write(encoded.getBytes())
  }
}
====
Edit 2: here is what ToKVTransform actually looks like:
class SchemaExtractorTransform extends PTransform[PCollection[String], PCollection[Schema]] {
  class InferSchemaFromStringWithKeyFn extends DoFn[String, KV[String, Schema]] {
    override def processElement(c: DoFn[String, KV[String, Schema]]#ProcessContext): Unit = {
      val line = c.element()
      inferSchemaFromString(line)
    }
  }

  class GetFirstFn extends DoFn[KV[String, java.lang.Iterable[Schema]], Schema] {
    override def processElement(c: DoFn[KV[String, java.lang.Iterable[Schema]], Schema]#ProcessContext): Unit = {
      val idAndSchemas: KV[String, java.lang.Iterable[Schema]] = c.element()
      val it: java.util.Iterator[Schema] = idAndSchemas.getValue().iterator()
      c.output(it.next())
    }
  }

  override def apply(inputLines: PCollection[String]): PCollection[Schema] = {
    val schemasWithKey: PCollection[KV[String, Schema]] = inputLines.apply(
      ParDo.named("InferSchemas").of(new InferSchemaFromStringWithKeyFn())
    )

    val keyed: PCollection[KV[String, java.lang.Iterable[Schema]]] = schemasWithKey.apply(
      GroupByKey.create()
    )

    val schemasOnly: PCollection[Schema] = keyed.apply(
      ParDo.named("GetFirst").of(new GetFirstFn())
    )

    schemasOnly
  }
}
Answer 0 (score: 2)
This problem doesn't reproduce in Java; Scala handles the types differently in a way that breaks Dataflow's coder inference. To work around it, you can call setCoder on a PCollection to set its Coder explicitly, such as
schemasWithKey.setCoder(KvCoder.of(StringUtf8Coder.of(), SchemaCoder.of()));
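For context, here is a minimal sketch of how that fix would slot into the Scala transform from the question. Two assumptions: the posted SchemaCoder does not define an of() factory, so the companion object below is added purely so the one-liner above compiles as written (constructing new SchemaCoder() directly would work just as well), and the surrounding apply method is the one shown in Edit 2.

object SchemaCoder {
  // Hypothetical companion factory; the SchemaCoder class posted above does not define of().
  def of(): SchemaCoder = new SchemaCoder()
}

// Inside SchemaExtractorTransform.apply, immediately after the InferSchemas ParDo:
val schemasWithKey: PCollection[KV[String, Schema]] = inputLines.apply(
  ParDo.named("InferSchemas").of(new InferSchemaFromStringWithKeyFn())
)
schemasWithKey.setCoder(KvCoder.of(StringUtf8Coder.of(), SchemaCoder.of()))

Setting the KvCoder explicitly this way bypasses the inference step that was otherwise falling back to SerializableCoder for the Schema values.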
Here is the Java version of your code, just to make sure it does roughly the same thing:
public static class SchemaExtractorTransform
    extends PTransform<PCollection<String>, PCollection<Schema>> {

  class InferSchemaFromStringWithKeyFn extends DoFn<String, KV<String, Schema>> {
    public void processElement(ProcessContext c) {
      c.output(KV.of(c.element(), new Schema()));
    }
  }

  class GetFirstFn extends DoFn<KV<String, java.lang.Iterable<Schema>>, Schema> {
    private static final long serialVersionUID = 0;
    public void processElement(ProcessContext c) {
      c.output(c.element().getValue().iterator().next());
    }
  }

  public PCollection<Schema> apply(PCollection<String> inputLines) {
    PCollection<KV<String, Schema>> schemasWithKey = inputLines.apply(
        ParDo.named("InferSchemas").of(new InferSchemaFromStringWithKeyFn()));

    PCollection<KV<String, java.lang.Iterable<Schema>>> keyed =
        schemasWithKey.apply(GroupByKey.<String, Schema>create());

    PCollection<Schema> schemasOnly =
        keyed.apply(ParDo.named("GetFirst").of(new GetFirstFn()));

    return schemasOnly;
  }
}