I'm using the "Export to BigQuery" feature in Pub/Sub, which streams plain JSON messages from Pub/Sub into BigQuery through Dataflow.
It worked for a moment, meaning a few entries made it into BigQuery correctly, but now I'm getting this error in the Dataflow logs:
java.lang.RuntimeException: java.io.IOException: Insert failed: [{"errors":[{"debugInfo":"","location":"_comments","message":"no such field.","reason":"invalid"}],"index":0}]
        org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn.flushRows(StreamingWriteFn.java:131)
        org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn.finishBundle(StreamingWriteFn.java:97)
Caused by: java.io.IOException: Insert failed: [{"errors":[{"debugInfo":"","location":"_comments","message":"no such field.","reason":"invalid"}],"index":0}]
... many more lines ...
        org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.insertAll(BigQueryServicesImpl.java:811)
        org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn.flushRows(StreamingWriteFn.java:127)
        org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn.finishBundle(StreamingWriteFn.java:97)
        org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteFn$DoFnInvoker.invokeFinishBundle(Unknown Source)
        org.apache.beam.runners.core.SimpleDoFnRunner.finishBundle(SimpleDoFnRunner.java:187)
        com.google.cloud.dataflow.worker.SimpleParDoFn.finishBundle(SimpleParDoFn.java:407)
        com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.finish(ParDoOperation.java:60)
        com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:76)
        com.google.cloud.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1069)
        com.google.cloud.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:133)
        com.google.cloud.dataflow.worker.StreamingDataflowWorker$8.run(StreamingDataflowWorker.java:841)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        java.lang.Thread.run(Thread.java:745)
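For context, here is a minimal sketch of how the messages get published (project, topic, and the id/body fields are placeholders, not my real schema). My understanding is that every top-level key of the JSON payload has to map to a column of the destination table, so a key like _comments with no matching column would presumably trigger exactly this "no such field" insert error:

```java
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class PublishExample {
    public static void main(String[] args) throws Exception {
        // Placeholder project/topic names.
        TopicName topic = TopicName.of("my-project", "my-topic");
        Publisher publisher = Publisher.newBuilder(topic).build();
        try {
            // "id" and "body" stand in for columns that exist in the table;
            // an extra key such as "_comments" would not match any column
            // and the streaming insert into BigQuery would be rejected.
            String json = "{\"id\": 1, \"body\": \"hello\"}";
            PubsubMessage msg = PubsubMessage.newBuilder()
                    .setData(ByteString.copyFromUtf8(json))
                    .build();
            publisher.publish(msg).get();  // wait for the publish to complete
        } finally {
            publisher.shutdown();
        }
    }
}
```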