I have a large, complex application that makes heavy use of the Python logging module.
I need to start sending these logs to a Kafka cluster, and I need to be sure I don't change the data along the way.
The ideal solution for me would be to create a new handler for Kafka, so that logs go to both the old logging solution and Kafka at the same time. Then I could eventually switch off the old handlers and send only to Kafka.
However, I don't see any Kafka logging handlers, only Kafka clients. Adding a Kafka client would mean tracking down every current logging call and adding a separate call to the new Kafka client, and getting identical output that way would be difficult.
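For what it's worth, the migration described here needs nothing Kafka-specific at the call sites: the standard logging module already fans each record out to every handler attached to a logger, so a Kafka handler can simply sit next to the existing one. A minimal sketch of that end state, assuming a KafkaLoggingHandler like the one implemented in the first answer below:

import logging

# the "old" logging solution stays in place
root = logging.getLogger()
root.addHandler(logging.FileHandler("app.log"))

# add the Kafka handler alongside it (KafkaLoggingHandler is the
# handler sketched in the answer below, not a stdlib class)
root.addHandler(KafkaLoggingHandler("localhost", 9092, "app_logs"))

# every existing logging call now reaches both destinations unchanged
logging.getLogger("myapp").info("goes to the file AND to Kafka")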
Answer 0 (score: 15)
The handler implementation is very simple. In fact, setting up the environment took more time than implementing the handler itself.
The handler constructor accepts an optional key argument. If it is provided, messages are written to the single partition specified by that key. If it is not provided, messages are distributed across the servers in round-robin fashion.
I haven't tested it much, but it is simple enough that I don't see what could go wrong here. Hope it will be useful.
from kafka.client import KafkaClient
from kafka.producer import SimpleProducer, KeyedProducer
import logging
import sys


class KafkaLoggingHandler(logging.Handler):

    def __init__(self, host, port, topic, key=None):
        logging.Handler.__init__(self)
        self.kafka_client = KafkaClient(host, port)
        self.key = key
        if key is None:
            self.producer = SimpleProducer(self.kafka_client, topic)
        else:
            self.producer = KeyedProducer(self.kafka_client, topic)

    def emit(self, record):
        # drop kafka logging to avoid infinite recursion
        if record.name == 'kafka':
            return
        try:
            # use default formatting
            msg = self.format(record)
            # produce message
            if self.key is None:
                self.producer.send_messages(msg)
            else:
                self.producer.send(self.key, msg)
        except:
            import traceback
            ei = sys.exc_info()
            traceback.print_exception(ei[0], ei[1], ei[2], None, sys.stderr)
            del ei

    def close(self):
        self.producer.stop()
        logging.Handler.close(self)


kh = KafkaLoggingHandler("localhost", 9092, "test_log")
# OR
# kh = KafkaLoggingHandler("localhost", 9092, "test_log", "key1")

logger = logging.getLogger("")
logger.setLevel(logging.DEBUG)
logger.addHandler(kh)

logger.info("The %s boxing wizards jump %s", 5, "quickly")
logger.debug("The quick brown %s jumps over the lazy %s", "fox", "dog")

try:
    import math
    math.exp(1000)
except:
    logger.exception("Problem with %s", "math.exp")
P.S. The handler uses this Kafka client: https://github.com/mumrah/kafka-python
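One optional addition of mine, not part of the original answer: emit() calls self.format(record), so the default output is just the bare message text. Attaching a standard logging.Formatter to the handler before registering it puts timestamps and level names into the Kafka messages as well:

import logging

kh = KafkaLoggingHandler("localhost", 9092, "test_log")
# format records before they are produced to Kafka
kh.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
logging.getLogger("").addHandler(kh)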
Answer 1 (score: 3)
This is a great solution, thanks! The kafka-python library has been updated over the past few years and some of its APIs are now deprecated, but the overall design of this handler is still very useful, so thanks again.
SimpleProducer (deprecated) --> KafkaProducer
SimpleConsumer (deprecated) --> KafkaConsumer
Here is my modified snippet, using Kafka 1.0.0 and kafka-python 1.4.2. It is producer-only, because I consume on the other end via logstash.
Hope this works for you!
# -*- coding: utf-8 -*-
"""Module to test out logging to kafka."""
import logging

from utils.kafka_handler import KafkaHandler


def run_it():
    """Run the actual connections."""
    logger = logging.getLogger(__name__)
    # enable the debug logger if you want to see ALL of the lines
    # logging.basicConfig(level=logging.DEBUG)
    logger.setLevel(logging.DEBUG)
    kh = KafkaHandler(['localhost:9092'], 'sebtest')
    logger.addHandler(kh)
    logger.info("I'm a little logger, short and stout")
    logger.debug("Don't tase me bro!")


if __name__ == "__main__":
    run_it()
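And the handler module itself, saved so that it imports as utils.kafka_handler to match the import above: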
# -*- coding: utf-8 -*-
"""Module to provide kafka handlers for internal logging facility."""
import json
import logging

from kafka import KafkaProducer


class KafkaHandler(logging.Handler):
    """Class to instantiate the kafka logging facility."""

    def __init__(self, hostlist, topic='corp_it_testing', tls=None):
        """Initialize an instance of the kafka handler."""
        logging.Handler.__init__(self)
        self.producer = KafkaProducer(
            bootstrap_servers=hostlist,
            value_serializer=lambda v: json.dumps(v).encode('utf-8'),
            linger_ms=10)
        self.topic = topic

    def emit(self, record):
        """Emit the provided record to the kafka_client producer."""
        # drop kafka logging to avoid infinite recursion
        if 'kafka.' in record.name:
            return
        try:
            # apply the logger formatter
            msg = self.format(record)
            self.producer.send(self.topic, {'message': msg})
            self.flush(timeout=1.0)
        except Exception:
            logging.Handler.handleError(self, record)

    def flush(self, timeout=None):
        """Flush the objects."""
        self.producer.flush(timeout=timeout)

    def close(self):
        """Close the producer and clean up."""
        self.acquire()
        try:
            if self.producer:
                self.producer.close()
            logging.Handler.close(self)
        finally:
            self.release()
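If you want to sanity-check the handler without logstash, one way (my addition; the topic name and broker address are the same placeholders used above) is to read the JSON payloads back with kafka-python's KafkaConsumer, mirroring the producer's value_serializer with a matching deserializer:

# -*- coding: utf-8 -*-
"""Quick consumer to verify what the handler wrote (illustrative only)."""
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'sebtest',                        # same topic the handler produced to
    bootstrap_servers=['localhost:9092'],
    auto_offset_reset='earliest',     # start from the beginning of the topic
    consumer_timeout_ms=5000,         # stop iterating after 5s with no messages
    value_deserializer=lambda m: json.loads(m.decode('utf-8')))

for msg in consumer:
    print(msg.value['message'])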