Type conversion error when ingesting data from PostgreSQL into Druid

Time: 2019-07-10 09:43:26

Tags: java druid

I am trying to ingest data from PostgreSQL into Druid using a Firehose.

I have already added druid.extensions.loadList=["postgresql-metadata-storage"] to the conf file, but the task fails with:


java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to java.nio.ByteBuffer
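For context, the extension is enabled in the common runtime properties; a sketch is below (the exact file path depends on the deployment layout):

# conf/druid/_common/common.runtime.properties (path varies by deployment)
druid.extensions.loadList=["postgresql-metadata-storage"]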

Ingestion spec file:

{
  "type": "index",
  "spec": {
      "dataSchema": {
          "dataSource": "dataset_1007",
          "parser": {
              "type": "string",
              "parseSpec": {
                  "format": "tsv",
                  "columns": [
                      "id",
                      "name",
                      "datetimecol"
                  ],
                  "timestampSpec": {
                      "column": "datetimecol",
                      "format": "auto"
                  },
                  "dimensionsSpec": {
                      "dimensions": [
                          "id",
                          "name",
                          "datetimecol"
                      ]
                  }
              }
          },
          "granularitySpec": {
              "type": "uniform",
              "segmentGranularity": "DAY",
              "queryGranularity": "NONE",
              "rollup": false
          }
      },
      "ioConfig": {
          "type": "index",
          "firehose": {
              "type": "sql",
              "database": {
                  "type": "postgresql",
                  "connectorConfig": {
                      "connectURI": "jdbc:postgresql://ISVDRDBILXP1/testdb",
                      "user": "druid",
                      "password": "druid"
                  }
              },
              "sqls": [
                  "SELECT id,name,datetimecol FROM public.testtable"
              ]
          },
          "appendToExisting": false
      },
      "tuningConfig": {
          "forceExtendableShardSpecs": true,
          "type": "index"
      }
  }
}

It is really hard to track down which table column is causing this, so I changed every column type to varchar. Please point out where I have gone wrong.
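After that change the table looks roughly like this (a simplified, hypothetical DDL inferred from the SELECT above; the real definition is not shown, and the varchar lengths are illustrative):

CREATE TABLE public.testtable (
    id          varchar(64),   -- illustrative length; originally a non-varchar type
    name        varchar(255),  -- illustrative length
    datetimecol varchar(64)    -- timestamp stored as text, to be parsed by "format": "auto"
);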

Update

Stacktrace:

2019-07-10T09:44:10,476 INFO [firehose_fetch_0] org.apache.druid.data.input.impl.prefetch.Fetcher - Fetching [0]th object[SELECT id,name,datetimecol FROM public.testtable], fetchedBytes[0]
2019-07-10T09:44:10,528 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-10T09:44:10,530 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.ForbiddenExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-10T09:44:10,530 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding org.apache.druid.server.initialization.jetty.BadRequestExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-10T09:44:10,531 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-10T09:44:10,538 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
2019-07-10T09:44:10,636 INFO [task-runner-0-priority-0] org.apache.druid.data.input.impl.prefetch.CacheManager - Object[SELECT id,name,datetimecol FROM public.testtable] is cached. Current cached bytes is [188]
2019-07-10T09:44:10,648 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.common.task.IndexTask - Encountered exception in DETERMINE_PARTITIONS.
java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to java.nio.ByteBuffer
    at org.apache.druid.segment.transform.TransformingStringInputRowParser.parseBatch(TransformingStringInputRowParser.java:31) ~[druid-processing-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.data.input.impl.SqlFirehose.nextRow(SqlFirehose.java:68) ~[druid-core-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.collectIntervalsAndShardSpecs(IndexTask.java:744) ~[druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.createShardSpecsFromInput(IndexTask.java:671) ~[druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.determineShardSpecs(IndexTask.java:606) ~[druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.common.task.IndexTask.run(IndexTask.java:437) [druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:419) [druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:391) [druid-indexing-service-0.15.0-incubating.jar:0.15.0-incubating]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_212]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_212]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_212]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]

1 Answer:

Answer 0 (score: 0):

In case anyone is looking for the answer: when ingesting data from SQL, we have to use the map parser. Here is the updated spec I am using:

  "parser": {
    "type" : "map",
    "parseSpec": {
      "format": "timeAndDims",
      "dimensionsSpec": {
        "dimensions": [
          "dim1",
          "dim2",
          "dim3"
        ]
      },
      "timestampSpec": {
        "format": "auto",
        "column": "ts"
      }
    }
  }
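
For completeness, here is a sketch of the full task spec with the map parser slotted in (assembled from the question's original spec; only id and name are listed as dimensions here, since datetimecol is consumed by the timestampSpec, though it could also be kept as a dimension):

{
  "type": "index",
  "spec": {
    "dataSchema": {
      "dataSource": "dataset_1007",
      "parser": {
        "type": "map",
        "parseSpec": {
          "format": "timeAndDims",
          "timestampSpec": {
            "column": "datetimecol",
            "format": "auto"
          },
          "dimensionsSpec": {
            "dimensions": [
              "id",
              "name"
            ]
          }
        }
      },
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE",
        "rollup": false
      }
    },
    "ioConfig": {
      "type": "index",
      "firehose": {
        "type": "sql",
        "database": {
          "type": "postgresql",
          "connectorConfig": {
            "connectURI": "jdbc:postgresql://ISVDRDBILXP1/testdb",
            "user": "druid",
            "password": "druid"
          }
        },
        "sqls": [
          "SELECT id,name,datetimecol FROM public.testtable"
        ]
      },
      "appendToExisting": false
    },
    "tuningConfig": {
      "type": "index",
      "forceExtendableShardSpecs": true
    }
  }
}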