CoderException: java.io.EOFException when doing GroupByKey on JSON values using a custom Coder with Jackson

Date: 2016-06-23 16:05:21

Tags: java jackson google-cloud-dataflow

Why do I get this EOFException when running the following code?

I have used GroupByKey successfully in simpler cases. What seems to trigger the error is the use of a custom coder (for JSON objects). Can anyone explain why this is happening?

Here is the error:

com.google.cloud.dataflow.sdk.Pipeline$PipelineExecutionException: com.google.cloud.dataflow.sdk.coders.CoderException: java.io.EOFException

    at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:186)
    at com.google.cloud.dataflow.sdk.testing.TestPipeline.run(TestPipeline.java:106)
    at com.example.dataflow.TestGroupByKeyCustomCoder.testPipeline(TestGroupByKeyCustomCoder.java:85)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:119)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
    at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:234)
    at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:74)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: com.google.cloud.dataflow.sdk.coders.CoderException: java.io.EOFException
    at com.google.cloud.dataflow.sdk.coders.BigEndianLongCoder.decode(BigEndianLongCoder.java:62)
    at com.google.cloud.dataflow.sdk.coders.InstantCoder.decode(InstantCoder.java:83)
    at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:621)
    at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:553)
    at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:98)
    at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:42)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromSafeStream(CoderUtils.java:157)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromByteArray(CoderUtils.java:140)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromByteArray(CoderUtils.java:134)
    at com.google.cloud.dataflow.sdk.util.MutationDetectors$CodedValueMutationDetector.<init>(MutationDetectors.java:107)
    at com.google.cloud.dataflow.sdk.util.MutationDetectors.forValueWithCoder(MutationDetectors.java:44)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:197)
    at java.io.DataInputStream.readLong(DataInputStream.java:416)
    at com.google.cloud.dataflow.sdk.coders.BigEndianLongCoder.decode(BigEndianLongCoder.java:58)
    at com.google.cloud.dataflow.sdk.coders.InstantCoder.decode(InstantCoder.java:83)
    at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:621)
    at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.decode(WindowedValue.java:553)
    at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:98)
    at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:42)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromSafeStream(CoderUtils.java:157)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromByteArray(CoderUtils.java:140)
    at com.google.cloud.dataflow.sdk.util.CoderUtils.decodeFromByteArray(CoderUtils.java:134)
    at com.google.cloud.dataflow.sdk.util.MutationDetectors$CodedValueMutationDetector.<init>(MutationDetectors.java:107)
    at com.google.cloud.dataflow.sdk.util.MutationDetectors.forValueWithCoder(MutationDetectors.java:44)
    at com.google.cloud.dataflow.sdk.transforms.ParDo$ImmutabilityCheckingOutputManager.output(ParDo.java:1303)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.outputWindowedValue(DoFnRunnerBase.java:287)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnProcessContext.output(DoFnRunnerBase.java:449)
    at com.google.cloud.dataflow.sdk.util.ReifyTimestampAndWindowsDoFn.processElement(ReifyTimestampAndWindowsDoFn.java:38)
    at com.google.cloud.dataflow.sdk.util.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:49)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase.processElement(DoFnRunnerBase.java:138)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.evaluateHelper(ParDo.java:1229)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.evaluateSingleHelper(ParDo.java:1098)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.access$300(ParDo.java:457)
    at com.google.cloud.dataflow.sdk.transforms.ParDo$1.evaluate(ParDo.java:1084)
    at com.google.cloud.dataflow.sdk.transforms.ParDo$1.evaluate(ParDo.java:1079)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.visitTransform(DirectPipelineRunner.java:858)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:219)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:215)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:215)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:215)
    at com.google.cloud.dataflow.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:102)
    at com.google.cloud.dataflow.sdk.Pipeline.traverseTopologically(Pipeline.java:259)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.run(DirectPipelineRunner.java:814)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:526)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:96)
    at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:180)
    at com.google.cloud.dataflow.sdk.testing.TestPipeline.run(TestPipeline.java:106)
    at com.example.dataflow.TestGroupByKeyCustomCoder.testPipeline(TestGroupByKeyCustomCoder.java:85)

Here is the code:

package com.example.dataflow;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.cloud.dataflow.sdk.coders.CustomCoder;
import com.google.cloud.dataflow.sdk.testing.CoderProperties;
import com.google.cloud.dataflow.sdk.testing.TestPipeline;
import com.google.cloud.dataflow.sdk.transforms.*;
import com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow;
import com.google.cloud.dataflow.sdk.transforms.windowing.PaneInfo;
import com.google.cloud.dataflow.sdk.util.WindowedValue;
import org.joda.time.Instant;
import org.junit.Assert;
import org.junit.Test;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;


class ParseJson extends DoFn<String, JsonNode> {

    private static final long serialVersionUID = 1L;
    private transient ObjectMapper om;

    { init(); }

    private void init() {
        om = new ObjectMapper();
    }

    private void readObject(java.io.ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        init();
    }

    @Override
    public void processElement(ProcessContext c) throws Exception {
        JsonNode node = om.readTree(c.element());
        c.output(node);
    }
}

class JsonNodeCoder extends CustomCoder<JsonNode> {

    private static final long serialVersionUID = 1L;

    private ObjectMapper mapper = new ObjectMapper();

    private static final JsonNodeCoder INSTANCE = new JsonNodeCoder();

    public static JsonNodeCoder of() {
        return INSTANCE;
    }

    @Override
    public void encode(JsonNode value, OutputStream outStream, Context context) throws IOException {
        mapper.configure(JsonGenerator.Feature.AUTO_CLOSE_TARGET, false).writeValue(outStream, value);
    }

    @Override
    public JsonNode decode(InputStream inStream, Context context) throws IOException {
        return mapper.configure(JsonParser.Feature.AUTO_CLOSE_SOURCE, false).readTree(inStream);
    }
}

public class TestGroupByKeyCustomCoder {

    @Test // original code that produces the error
    public void testPipeline() throws IOException {

        TestPipeline p = TestPipeline.create();

        p.getCoderRegistry().registerCoder(JsonNode.class, JsonNodeCoder.class);

        p.apply(Create.of("{}"))
                .apply(ParDo.of(new ParseJson()))
                .apply(WithKeys.of("foo"))
                .apply("GroupByAction", GroupByKey.create());

        p.run();
    }

    // Test as per Kenn Knowles' suggestion
    // this throws the same error
    @Test
    public void testCustomCoder() throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode value = mapper.readTree("{}");

        WindowedValue.FullWindowedValueCoder<JsonNode> windowedValueCoder
                = WindowedValue.FullWindowedValueCoder
                    .of(JsonNodeCoder.of(), GlobalWindow.Coder.INSTANCE);

        WindowedValue<JsonNode> x = WindowedValue.of(
                value, Instant.now(), GlobalWindow.INSTANCE, PaneInfo.ON_TIME_AND_ONLY_FIRING);
        CoderProperties.coderDecodeEncodeEqual(windowedValueCoder, x);
    }
}

The problem appears to be that readTree consumes too much of the input, swallowing the timestamp that Dataflow is looking for:

@Test
public void testJackson() throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    ByteArrayInputStream bis = new ByteArrayInputStream("{}1".getBytes());
    mapper.readTree(bis);
    Assert.assertNotEquals(bis.read(), -1); // assertion fails
}

1 Answer:

Answer (score: 2):

The stack trace indicates that the end of the stream was reached while decoding the big-endian long for the timestamp.

The encoding used by WindowedValue.FullWindowedValueCoder is your encoded value, followed by the timestamp, then the window(s), and finally the pane metadata. This in turn means that your JsonNodeCoder consumed too many bytes from the input stream (perhaps all of them?), so decoding the timestamp ran into the end of the stream.

The SDK provides a number of utilities for testing coders in CoderProperties. In fact, you can test this case directly (with the value in the global window) by running CoderProperties#coderDecodeEncodeEqual with the coder WindowedValue.FullWindowedValueCoder.of(JsonNodeCoder.of(), GlobalWindow.Coder.INSTANCE).

There is a flag passed to encode and decode that you may want to pay attention to: Coder.Context (a sketch of a coder that handles the nested case follows the list below).

  • Coder.Context.OUTER means your coder is the outermost Coder and owns the entire stream. In that case, when encoding you can rely on the end-of-stream signal and omit metadata such as length prefixes or brackets, and when decoding you may consume the stream as needed.
  • Coder.Context.NESTED means your Coder encodes only part of the stream's contents, so it must write enough metadata to consume exactly the bytes of its own encoding and no more.
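
Below is a minimal sketch of how the question's JsonNodeCoder could be made safe in the NESTED context. This is an assumption about one possible fix, not code from the original answer; the class name LengthPrefixedJsonNodeCoder is hypothetical. It serializes the JsonNode to a byte array, writes a length prefix, and on decode reads back exactly that many bytes, so the timestamp, window, and pane bytes that follow are left untouched.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.cloud.dataflow.sdk.coders.CustomCoder;

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical length-prefixed variant of the question's JsonNodeCoder.
class LengthPrefixedJsonNodeCoder extends CustomCoder<JsonNode> {

    private static final long serialVersionUID = 1L;

    // Static so the mapper never needs to be serialized with the coder.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    private static final LengthPrefixedJsonNodeCoder INSTANCE = new LengthPrefixedJsonNodeCoder();

    public static LengthPrefixedJsonNodeCoder of() {
        return INSTANCE;
    }

    @Override
    public void encode(JsonNode value, OutputStream outStream, Context context) throws IOException {
        // Serialize to a byte array first, then write its length followed by the bytes.
        byte[] bytes = MAPPER.writeValueAsBytes(value);
        DataOutputStream dataOut = new DataOutputStream(outStream);
        dataOut.writeInt(bytes.length);
        dataOut.write(bytes);
        dataOut.flush();
    }

    @Override
    public JsonNode decode(InputStream inStream, Context context) throws IOException {
        // Read back exactly the number of bytes written by encode; anything after
        // them (timestamp, window, pane info) is left for the enclosing coder.
        DataInputStream dataIn = new DataInputStream(inStream);
        int length = dataIn.readInt();
        byte[] bytes = new byte[length];
        dataIn.readFully(bytes);
        return MAPPER.readTree(bytes);
    }
}

With a coder along these lines registered in place of the original JsonNodeCoder, both tests above should no longer hit the EOFException, because decoding the JSON never reads past its own bytes. In the OUTER context the length prefix is technically redundant, but always writing it keeps the coder simple.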