Flink 1.11 with the debezium-json format

Time: 2021-03-31 15:07:06

Tags: apache-flink flink-sql

In Flink 1.11 I'm trying out the debezium-json format, and the following should work, right? I'm trying to follow the documentation [1].

    TableResult products = bsTableEnv.executeSql(
            "CREATE TABLE products (\n" +
                    "  id BIGINT,\n" +
                    "  name STRING,\n" +
                    "  description STRING,\n" +
                    "  weight DECIMAL(10, 2)\n" +
                    ") WITH (\n" +
                    " 'connector' = 'kafka',\n" +
                    " 'topic' = 'dbserver1.inventory.products',\n" +
                    " 'properties.bootstrap.servers' = 'localhost:9092',\n" +
                    " 'properties.group.id' = 'testGroup',\n" +
                    " 'scan.startup.mode' = 'earliest-offset',\n" +
                    " 'format' = 'debezium-json'" +
                    ")"
    );

    bsTableEnv.executeSql("SHOW TABLES").print(); // This seems to work
    bsTableEnv.executeSql("SELECT id FROM products").print();

Output snippet / exception:

+------------+
| table name |
+------------+
|   products |
+------------+
1 row in set
Exception in thread "main" org.apache.flink.table.api.TableException: AppendStreamTableSink doesn't support consuming update and delete changes which is produced by node TableSourceScan(table=[[default_catalog, default_database, products]], fields=[id, name, description, weight])

I have verified the Debezium setup, and there are messages in the dbserver1.inventory.products topic. I can read Kafka topics in Flink by other means, but as mentioned, I expected the debezium-json format to work.

Also, I know Flink 1.12 introduced the new upsert Kafka connector, but I'm limited to 1.11 for now.

I'm still fairly new to Flink, so it's entirely possible I'm missing something obvious here.

Thanks in advance.

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/connectors/formats/debezium.html

1 answer:

Answer 0 (score: 0)

It seems I asked too soon. In case it helps someone else, I was able to get it working with:

    Table results = bsTableEnv.sqlQuery("SELECT id, name FROM products");
    bsTableEnv.toRetractStream(results, Row.class).print();
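For completeness, here is a fuller sketch of the working approach. The class name, the environment setup, and the final `execute()` call are my additions and were not in the original answer; the DDL is copied from the question. `toRetractStream` can consume the update and delete changes that the Debezium source produces, which is exactly what the append-only sink behind `TableResult.print()` could not do in 1.11. Running this requires a Kafka broker with the Debezium topic, so it is a sketch, not a turnkey program.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class DebeziumJsonExample {
    public static void main(String[] args) throws Exception {
        // Blink planner in streaming mode, matching the "bsTableEnv" naming in the question
        StreamExecutionEnvironment bsEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        StreamTableEnvironment bsTableEnv = StreamTableEnvironment.create(bsEnv, settings);

        // Same DDL as in the question
        bsTableEnv.executeSql(
                "CREATE TABLE products (\n" +
                "  id BIGINT,\n" +
                "  name STRING,\n" +
                "  description STRING,\n" +
                "  weight DECIMAL(10, 2)\n" +
                ") WITH (\n" +
                " 'connector' = 'kafka',\n" +
                " 'topic' = 'dbserver1.inventory.products',\n" +
                " 'properties.bootstrap.servers' = 'localhost:9092',\n" +
                " 'properties.group.id' = 'testGroup',\n" +
                " 'scan.startup.mode' = 'earliest-offset',\n" +
                " 'format' = 'debezium-json'\n" +
                ")");

        // toRetractStream emits (true, row) for additions and (false, row) for
        // retractions, so it can represent the changelog from the Debezium source
        Table results = bsTableEnv.sqlQuery("SELECT id, name FROM products");
        bsTableEnv.toRetractStream(results, Row.class).print();

        // Unlike executeSql, the DataStream pipeline only runs once the job is started
        bsEnv.execute("debezium-json example");
    }
}
```

Note the explicit `bsEnv.execute(...)` at the end: once the table is bridged to a `DataStream` via `toRetractStream`, the job must be submitted through the `StreamExecutionEnvironment`, or nothing will run.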