I am trying to set up some decent logging from Python via log4j.
I want every log message written to a database, only errors written to the error.log file, and only info messages written to the info.log file.
# Access log4j in the JVM through the SparkContext's Py4J gateway
logger = sc._jvm.org.apache.log4j
lg = logger.LogManager.getRootLogger()
lg.info('test')
lg.error('test')
lg.debug('test')
lg.fatal('test')
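For reference, the same calls can also go through a named logger instead of the root logger. The snippet below is only a sketch, assuming an active SparkContext sc in a PySpark shell and the standard log4j 1.x API exposed over the Py4J gateway; the logger name "ria" is just an illustrative placeholder.

# Sketch only: named logger instead of the root logger (log4j 1.x via Py4J)
log4j = sc._jvm.org.apache.log4j
named = log4j.LogManager.getLogger("ria")   # "ria" is a hypothetical logger name
print(named.getEffectiveLevel())            # level inherited from the root logger
named.info('test via named logger')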
My log4j.properties file looks like this:
# Set everything to be logged to the console
log4j.rootLogger=INFO, ria_info, ria_error, ria_mysql, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=[%p] %d %c %M - %m%n
# Set info logs to be written to info file
log4j.appender.ria_info=org.apache.log4j.RollingFileAppender
log4j.appender.ria_info.filter.RangeFilter=org.apache.log4j.varia.LevelMatchFilter
log4j.appender.ria_info.filter.RangeFilter.LevelToMatch=INFO
log4j.appender.ria_info.filter.RangeFilter.AcceptOnMatch=true
log4j.appender.ria_info.layout=org.apache.log4j.PatternLayout
log4j.appender.ria_info.layout.ConversionPattern=[%p] %d %c %M - %m%n
log4j.appender.ria_info.File=/home/data/logs/info.log
log4j.appender.ria_info.MaxFileSize=10MB
log4j.appender.ria_info.MaxBackupIndex=10
log4j.appender.ria_error=org.apache.log4j.RollingFileAppender
log4j.appender.ria_error.Append=false
log4j.appender.ria_error.filter.RangeFilter=org.apache.log4j.varia.LevelMatchFilter
log4j.appender.ria_error.filter.RangeFilter.LevelToMatch=ERROR
log4j.appender.ria_error.filter.RangeFilter.AcceptOnMatch=true
log4j.appender.ria_error.layout=org.apache.log4j.PatternLayout
log4j.appender.ria_error.layout.ConversionPattern=[%p] %d %c %M - %m%n
log4j.appender.ria_error.File=/home/data/logs/error.log
log4j.appender.ria_error.MaxFileSize=10MB
log4j.appender.ria_error.MaxBackupIndex=10
log4j.appender.ria_mysql=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.ria_mysql.URL=jdbc:mysql://localhost/DB
log4j.appender.ria_mysql.driver=com.mysql.jdbc.Driver
log4j.appender.ria_mysql.user=xxxx
log4j.appender.ria_mysql.password=xxxxx
log4j.appender.ria_mysql.sql=INSERT INTO LOGS VALUES('%p','%d{yyyy-MM-dd HH:mm:ss}','%t','%x','%c','%m')
log4j.appender.ria_mysql.layout=org.apache.log4j.PatternLayout
# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
Now I do receive the error and fatal 'test' messages both in my database and in error.log. But for some reason the info and debug messages are missing entirely, and lg.isInfoEnabled() also returns False.
I have tried quite a few things with additivity, but none of them seem to solve the problem.
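For completeness, this is the kind of check I can run from the Python side. It is only a sketch, assuming the standard log4j 1.x Level API is reachable through the same gateway.

# Diagnostic sketch: inspect the root logger's level and force INFO at runtime
log4j = sc._jvm.org.apache.log4j
root = log4j.LogManager.getRootLogger()
print(root.getLevel(), root.getEffectiveLevel(), root.isInfoEnabled())
root.setLevel(log4j.Level.INFO)   # temporary override, to rule out a level problem
print(root.isInfoEnabled())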