Python 3.7.3, kafka-python 1.4.6, debezium/kafka 0.9
I am setting up a consumer with kafka-python (configured as in its documentation), but I am not receiving any messages and I cannot tell what is wrong. I have also tried adding more configuration parameters, such as request_timeout_ms and enable_auto_commit=False, with no change (a sketch of how I passed them is shown after the log output). With the log level set to DEBUG, running the script produces the following output:
...
test_1 | 2019-07-10 15:34:04,651: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <disconnected> [unspecified None]>: creating new socket
test_1 | 2019-07-10 15:34:04,651: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <disconnected> [IPv4 ('172.30.0.3', 9092)]>: setting socket option (6, 1, 1)
test_1 | 2019-07-10 15:34:04,651: kafka.conn - INFO: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connecting> [IPv4 ('172.30.0.3', 9092)]>: connecting to kafka:9092 [('172.30.0.3', 9092) IPv4]
test_1 | 2019-07-10 15:34:04,652: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connecting> [IPv4 ('172.30.0.3', 9092)]>: established TCP connection
test_1 | 2019-07-10 15:34:04,653: kafka.conn - INFO: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connecting> [IPv4 ('172.30.0.3', 9092)]>: Connection complete.
test_1 | 2019-07-10 15:34:04,653: kafka.client - DEBUG: Node bootstrap-0 connected
test_1 | 2019-07-10 15:34:04,653: kafka.client - DEBUG: Sending metadata request MetadataRequest_v1(topics=['dbhistory.beta-api.profile']) to node bootstrap-0
test_1 | 2019-07-10 15:34:04,653: kafka.protocol.parser - DEBUG: Sending request MetadataRequest_v1(topics=['dbhistory.beta-api.profile'])
test_1 | 2019-07-10 15:34:04,653: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Request 1: MetadataRequest_v1(topics=['dbhistory.beta-api.profile'])
test_1 | 2019-07-10 15:34:05,167: kafka.protocol.parser - DEBUG: Received correlation id: 1
test_1 | 2019-07-10 15:34:05,167: kafka.protocol.parser - DEBUG: Processing response MetadataResponse_v1
test_1 | 2019-07-10 15:34:05,169: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Response 1 (515.169620513916 ms): MetadataResponse_v1(brokers=[(node_id=1, host='172.30.0.3', port=9092, rack=None)], controller_id=1, topics=[(error_code=5, topic='dbhistory.beta-api.profile', is_internal=False, partitions=[])])
test_1 | 2019-07-10 15:34:05,271: kafka.client - DEBUG: Sending metadata request MetadataRequest_v1(topics=['dbhistory.beta-api.profile']) to node bootstrap-0
test_1 | 2019-07-10 15:34:05,272: kafka.protocol.parser - DEBUG: Sending request MetadataRequest_v1(topics=['dbhistory.beta-api.profile'])
test_1 | 2019-07-10 15:34:05,274: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Request 2: MetadataRequest_v1(topics=['dbhistory.beta-api.profile'])
test_1 | 2019-07-10 15:34:05,281: kafka.protocol.parser - DEBUG: Received correlation id: 2
test_1 | 2019-07-10 15:34:05,281: kafka.protocol.parser - DEBUG: Processing response MetadataResponse_v1
test_1 | 2019-07-10 15:34:05,282: kafka.conn - DEBUG: <BrokerConnection node_id=bootstrap-0 host=kafka:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Response 2 (7.951021194458008 ms): MetadataResponse_v1(brokers=[(node_id=1, host='172.30.0.3', port=9092, rack=None)], controller_id=1, topics=[(error_code=5, topic='dbhistory.beta-api.profile', is_internal=False, partitions=[])])
test_1 | 2019-07-10 15:34:05,386: kafka.client - DEBUG: Sending metadata request MetadataRequest_v1(topics=['dbhistory.beta-api.profile']) to node bootstrap-0
...
test_1 | 2019-07-10 15:34:05,815: kafka.conn - DEBUG: <BrokerConnection node_id=1 host=172.30.0.3:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Request 2: FetchRequest_v2(replica_id=-1, max_wait_time=500, min_bytes=1, topics=[(topic='dbhistory.beta-api.profile', partitions=[(partition=0, offset=0, max_bytes=1048576)])])
test_1 | 2019-07-10 15:34:06,326: kafka.protocol.parser - DEBUG: Received correlation id: 2
test_1 | 2019-07-10 15:34:06,326: kafka.protocol.parser - DEBUG: Processing response FetchResponse_v2
test_1 | 2019-07-10 15:34:06,327: kafka.conn - DEBUG: <BrokerConnection node_id=1 host=172.30.0.3:9092 <connected> [IPv4 ('172.30.0.3', 9092)]> Response 2 (511.2500190734863 ms): FetchResponse_v2(throttle_time_ms=0, topics=[(topics='dbhistory.beta-api.profile', partitions=[(partition=0, error_code=0, highwater_offset=0, message_set=b'')])])
test_1 | 2019-07-10 15:34:06,330: kafka.metrics.metrics - DEBUG: Added sensor with name topic.dbhistory.beta-api.profile.bytes-fetched
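For reference, the extra parameters mentioned above were passed roughly like this (only a sketch of one attempt; the exact values varied and none of them changed the result):

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "dbhistory.beta-api.profile",
    bootstrap_servers=['kafka:9092'],
    api_version=(0, 10, 0),
    auto_offset_reset='earliest',
    request_timeout_ms=30000,   # example value only
    enable_auto_commit=False,
)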
My Python consumer consists of main.py and the consumer module it imports:
# main.py (run by the `test` service as `python3 main.py`)
import consumer
import logging


def index():
    consum = consumer.Consumer()
    consum.run("dbhistory.beta-api.profile")


if __name__ == '__main__':
    logging.basicConfig(
        format='%(asctime)s: %(name)s - %(levelname)s: %(message)s',
        level=logging.DEBUG
    )
    index()
# consumer.py (imported above as `consumer`)
from kafka import KafkaConsumer


class Consumer:
    def run(self, topic):
        consumer = KafkaConsumer(
            topic,
            bootstrap_servers=['kafka:9092'],
            api_version=(0, 10, 0),
            auto_offset_reset='earliest',
        )
        print("Topic: {}".format(topic))
        print("### Getting messages...")
        print("Consumer")
        print(consumer)
        for msg in consumer:
            print("#### Message: ####")
            print(msg)
            print("#### EndMessage ####")
This is the docker-compose.yml I use to start ZooKeeper, Kafka, Debezium Connect, and the test consumer:

version: '2'
services:
  zookeeper:
    image: debezium/zookeeper:0.9
    hostname: zookeeper
    ports:
      - "2181:2181"
      - "2888:2888"
      - "3888:3888"
  kafka:
    image: debezium/kafka:0.9
    ports:
      - "9092:9092"
      - "29092:29092"
    environment:
      KAFKA_BROKER: kafka:9092
      ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
    links:
      - zookeeper
  connect:
    image: debezium/connect:0.9
    environment:
      GROUP_ID: 1
      CONFIG_STORAGE_TOPIC: my_connect_configs
      OFFSET_STORAGE_TOPIC: my_connect_offsets
      STATUS_STORAGE_TOPIC: my_connect_status
      BOOTSTRAP_SERVERS: kafka:9092
      PLUGIN_PATH: "/plugins"
    ports:
      - "8083:8083"
  test:
    build: .
    command: python3 main.py
    ports:
      - 5000:5000
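Since KAFKA_ADVERTISED_LISTENERS advertises kafka:9092 inside the compose network and localhost:29092 on the host, my understanding is that only a consumer running outside the containers would need the host listener, roughly:

from kafka import KafkaConsumer

# Only when running the script on the host instead of in the `test` container;
# inside the compose network, bootstrap_servers=['kafka:9092'] matches what the broker advertises.
consumer = KafkaConsumer(
    "dbhistory.beta-api.profile",
    bootstrap_servers=['localhost:29092'],
    api_version=(0, 10, 0),
    auto_offset_reset='earliest',
)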
I expected the consumed messages to contain the table information from my database.