KafkaLog4jAppender is not pushing application logs to the Kafka topic

Asked: 2015-05-11 15:35:23

Tags: log4j apache-kafka

I am fairly comfortable working with Kafka streams. For a specific requirement, I have to push my log4j logs directly to a Kafka topic.

I have a standalone Kafka installation running on CentOS, which I have verified with the Kafka producer and consumer console clients. I am also using the bundled ZooKeeper instance.

I have now also created a standalone Java application with log4j logging enabled, and I have edited its log4j.properties file as follows:

log4j.rootCategory=INFO
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.file.File=/home/edureka/Desktop/Anurag/logMe
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n

log4j.logger.com=INFO,file,KAFKA

#Kafka Appender
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p  %C %m%n

log4j.appender.KAFKA.ProducerType=async
log4j.appender.KAFKA.BrokerList=localhost:2181
log4j.appender.KAFKA.Topic=test
log4j.appender.KAFKA.Serializer=kafka.test.AppenderStringSerializer 

Now, when I run the application, all logs go to the local log file, but the consumer still shows no entries. The topic I am using is test in both scenarios.

Likewise, no error logs are generated. The verbose output from the log4j library is as follows:

log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@a1d92a.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@a1d92a class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@a1d92a.
log4j: Using URL [file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties
log4j: Parsing for [root] with value=[DEBUG, stdout, file].
log4j: Level token is [DEBUG].
log4j: Category root set to DEBUG
log4j: Parsing appender named "stdout".
log4j: Parsing layout options for "stdout".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "stdout".
log4j: Setting property [target] to [System.out].
log4j: Parsed "stdout" options.
log4j: Parsing appender named "file".
log4j: Parsing layout options for "file".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "file".
log4j: Setting property [file] to [/home/edureka/Desktop/Anurag/logMe].
log4j: Setting property [maxBackupIndex] to [10].
log4j: Setting property [maxFileSize] to [5MB].
log4j: setFile called: /home/edureka/Desktop/Anurag/logMe, true
log4j: setFile ended
log4j: Parsed "file" options.
log4j: Finished configuring.
2015-05-11 19:44:40 DEBUG TestMe:19 - This is debug : anurag
2015-05-11 19:44:40 INFO  TestMe:23 - This is info : anurag
2015-05-11 19:44:40 WARN  TestMe:26 - This is warn : anurag
2015-05-11 19:44:40 ERROR TestMe:27 - This is error : anurag
2015-05-11 19:44:40 FATAL TestMe:28 - This is fatal : anurag
2015-05-11 19:44:40 INFO  TestMe:29 - message from log4j appender

Any help would be great. Thanks, AJ

1 Answer:

Answer 0 (score: 1)

In your output, I don't see the KAFKA appender being created, so it's no wonder nothing is being logged to Kafka. My guess is that this is because you are only logging from a class named TestMe (probably in the default package), while the KAFKA appender is only attached to the logger named "com".
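One way to address this (a sketch, assuming you want every class in the application, including classes in the default package, to log to Kafka) is to attach the KAFKA appender to the root logger instead of the "com" logger:

```
# Sketch: attach the file and KAFKA appenders at the root, so loggers in
# any package (including the default package used by TestMe) inherit them.
log4j.rootLogger=INFO, file, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.appender.KAFKA.ProducerType=async
log4j.appender.KAFKA.Topic=test
# Note: BrokerList should point at the Kafka broker itself (default port
# 9092), not the ZooKeeper port 2181 used in the original config.
log4j.appender.KAFKA.BrokerList=localhost:9092
```

Alternatively, keep the `log4j.logger.com=INFO,file,KAFKA` line as-is and move TestMe into a package whose name starts with `com.`, so that it inherits from the "com" logger.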