Flink SQL Client connecting to a secure Kafka cluster

Asked: 2021-02-23 06:37:42

Tags: apache-kafka apache-flink flink-sql

I want to run queries against a Flink SQL table backed by a Kafka topic on a secure Kafka cluster. I can execute the query programmatically, but I cannot do the same through the Flink SQL Client. I am not sure how to pass the JAAS configuration (java.security.auth.login.config) and the other system properties through the Flink SQL Client.

Programmatic Flink SQL query

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;
    import org.apache.flink.table.api.TableResult;

    private static void simpleExec_auth() {

        // Create the execution environment.
        final EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .inStreamingMode()
                .withBuiltInCatalogName(
                        "default_catalog")
                .withBuiltInDatabaseName(
                        "default_database")
                .build();

        System.setProperty("java.security.auth.login.config","client_jaas.conf");
        System.setProperty("sun.security.jgss.native", "true");
        System.setProperty("sun.security.jgss.lib", "/usr/libexec/libgsswrap.so");
        System.setProperty("javax.security.auth.useSubjectCredsOnly","false");

        TableEnvironment tableEnvironment = TableEnvironment.create(settings);
        String createQuery = "CREATE TABLE  test_flink11 ( " + "`keyid` STRING, " + "`id` STRING, "
                + "`name` STRING, " + "`age` INT, " + "`color` STRING, " + "`rowtime` TIMESTAMP(3) METADATA FROM 'timestamp', " + "`proctime` AS PROCTIME(), " + "`address` STRING) " + "WITH ( "
                + "'connector' = 'kafka', "
                + "'topic' = 'test_flink10', "
                + "'scan.startup.mode' = 'latest-offset', "
                + "'properties.bootstrap.servers' = 'kafka01.nyc.com:9092', "
                + "'value.format' = 'avro-confluent', "
                + "'key.format' = 'avro-confluent', "
                + "'key.fields' = 'keyid', "
                + "'value.fields-include' = 'EXCEPT_KEY', "
                + "'properties.security.protocol' = 'SASL_PLAINTEXT', 'properties.sasl.kerberos.service.name' = 'kafka', 'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet', 'properties.sasl.mechanism' = 'GSSAPI', "
                + "'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'key.avro-confluent.schema-registry.subject' = 'test_flink6', "
                + "'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'value.avro-confluent.schema-registry.subject' = 'test_flink4')";
        System.out.println(createQuery);
        tableEnvironment.executeSql(createQuery);
        TableResult result = tableEnvironment
                .executeSql("SELECT name,rowtime FROM test_flink11");
        result.print();
    }
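For reference, the client_jaas.conf loaded by the snippet must contain a KafkaClient entry; its absence is exactly what the SQL Client error further down complains about. A minimal GSSAPI sketch (the keytab path and principal here are placeholders, not values from the question):

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka.keytab"
  principal="user@HADOOP.COM";
};
```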

This works fine.

Flink SQL query via the SQL Client

Running the query produces the following error.

Flink SQL> CREATE TABLE test_flink11 (
    `keyid` STRING, `id` STRING, `name` STRING, `address` STRING, `age` INT, `color` STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'test_flink10',
    'scan.startup.mode' = 'earliest-offset',
    'properties.bootstrap.servers' = 'kafka01.nyc.com:9092',
    'value.format' = 'avro-confluent',
    'key.format' = 'avro-confluent',
    'key.fields' = 'keyid',
    'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037',
    'value.avro-confluent.schema-registry.subject' = 'test_flink4',
    'value.fields-include' = 'EXCEPT_KEY',
    'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037',
    'key.avro-confluent.schema-registry.subject' = 'test_flink6',
    'properties.security.protocol' = 'SASL_PLAINTEXT',
    'properties.sasl.kerberos.service.name' = 'kafka',
    'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet',
    'properties.sasl.mechanism' = 'GSSAPI'
);

Flink SQL> select * from test_flink11;
[ERROR] Could not execute SQL statement. Reason:
java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is /tmp/jaas-6309821891889949793.conf

The file /tmp/jaas-6309821891889949793.conf contains nothing but the following comment:

# We are using this file as an workaround for the Kafka and ZK SASL implementation
# since they explicitly look for java.security.auth.login.config property
# Please do not edit/delete this file - See FLINK-3929

SQL Client launch command

bin/sql-client.sh embedded --jar flink-sql-connector-kafka_2.11-1.12.0.jar --jar flink-sql-avro-confluent-registry-1.12.0.jar

Flink cluster start command

bin/start-cluster.sh

How can I pass java.security.auth.login.config and the other system properties (the ones I set in the Java snippet above) to the SQL Client?

1 Answer:

Answer 0 (score: 0)

flink-conf.yaml

Either use the Kerberos ticket cache:

security.kerberos.login.use-ticket-cache: true
security.kerberos.login.principal: XXXXX@HADOOP.COM

or use a keytab:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/kafka.keytab
security.kerberos.login.principal: XXXX@HADOOP.COM

In either case, also register the JAAS login contexts:

security.kerberos.login.contexts: Client,KafkaClient
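With security.kerberos.login.contexts set, Flink's security module installs a login context for each listed name at runtime rather than writing them into the temp file (which is why /tmp/jaas-*.conf only contains the FLINK-3929 comment). The effective configuration should be roughly equivalent to this sketch; the keytab path and principal are placeholders:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka.keytab"
  principal="XXXX@HADOOP.COM";
};

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka.keytab"
  principal="XXXX@HADOOP.COM";
};
```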

I haven't actually tested whether this works, but you can give it a try. Hope it helps.
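The options above handle the Kerberos login itself but not the other JVM system properties from the programmatic snippet (sun.security.jgss.native, etc.). One way to pass those, assuming the launcher scripts in your Flink version forward JVM options to sql-client.sh, is env.java.opts in flink-conf.yaml (the values below are the ones from the question; the native library path is environment-specific):

```
env.java.opts: -Dsun.security.jgss.native=true -Dsun.security.jgss.lib=/usr/libexec/libgsswrap.so -Djavax.security.auth.useSubjectCredsOnly=false
```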