Hive log4j: why are the logs created under /var/log/hive named differently from what is defined in log4j?

Time: 2018-08-06 11:13:40

Tags: linux hadoop hive log4j

We have a Hadoop cluster, version 2.6.5 (Hortonworks), managed through the Ambari platform GUI.

We configured log4j with RollingFileAppender and MaxBackupIndex set to keep 10 backups.
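For reference, log4j 1.x's RollingFileAppender rolls strictly by size and index: on each rollover the backups shift to .1 … .MaxBackupIndex, the oldest backup is deleted, and no date is ever appended. A minimal simulation of that naming scheme (illustrative Python, not Hive code):

```python
# Sketch of log4j 1.x index-based rollover naming (an illustration of the
# documented RollingFileAppender behavior, not actual Hive/log4j code).

def rollover(files, base="hiveserver2.log", max_backup_index=10):
    """Return the file list after one rollover, log4j-1.x style."""
    # The oldest backup (.MaxBackupIndex) is deleted first.
    files = [f for f in files if f != f"{base}.{max_backup_index}"]
    renamed = []
    for f in files:
        if f == base:
            renamed.append(f"{base}.1")            # active file becomes .1
        elif f.startswith(base + "."):
            idx = int(f[len(base) + 1:])
            renamed.append(f"{base}.{idx + 1}")    # shift .n -> .n+1
        else:
            renamed.append(f)
    renamed.append(base)                           # fresh active file
    return renamed

state = ["hiveserver2.log"]
for _ in range(12):                  # roll more times than MaxBackupIndex
    state = rollover(state)

assert len(state) == 11                       # active file + 10 backups
assert "hiveserver2.log.10" in state          # oldest surviving backup
assert not any("-2018" in f for f in state)   # no date suffixes, ever
```

So with this appender, names carrying a date suffix cannot come from log4j itself; something else must be renaming the files.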

After restarting the Hive service, we noticed the following strange behavior.

Under /var/log/hive I can see logs like these (sample):

-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180805

I don't understand why the logs get a date suffix such as "-20180803", because this is not what we defined in hive-log4j.

Here is the hive-log4j configuration from Ambari:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyRollingFileAppender class instead if you want to use separate log files
# for different CLI sessions.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollover at midnight
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=100MB
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# 30-day backup
#log4j.appender.DRFA.MaxBackupIndex=30
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} [%t]: %p %c{2}: %m%n
log4j.appender.console.encoding=UTF-8
#custom logging levels
#log4j.logger.xxx=DEBUG
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter
log4j.category.DataNucleus=ERROR,DRFA
log4j.category.Datastore=ERROR,DRFA
log4j.category.Datastore.Schema=ERROR,DRFA
log4j.category.JPOX.Datastore=ERROR,DRFA
log4j.category.JPOX.Plugin=ERROR,DRFA
log4j.category.JPOX.MetaData=ERROR,DRFA
log4j.category.JPOX.Query=ERROR,DRFA
log4j.category.JPOX.General=ERROR,DRFA
log4j.category.JPOX.Enhancer=ERROR,DRFA
# Silence useless ZK logs
log4j.logger.org.apache.zookeeper.server.NIOServerCnxn=WARN,DRFA
log4j.logger.org.apache.zookeeper.ClientCnxnSocketNIO=WARN,DRFA
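Nothing in the configuration above produces a -YYYYMMDD suffix: the active appender is the index-based RollingFileAppender, and the DailyRollingFileAppender DatePattern line is commented out (and would yield ".2018-08-04", not "-20180804", anyway). One mechanism outside log4j that produces exactly this naming, and also explains stacked suffixes like -20180804-20180805, is an OS-level logrotate rule with dateext whose glob also matches already-rotated files. This is an assumption to verify on the host (e.g. under /etc/logrotate.d/), not something confirmed by the question; a hypothetical rule for illustration:

```
# Hypothetical /etc/logrotate.d/hive entry -- illustrative only,
# not taken from the question.
/var/log/hive/hiveserver2.log* {
    daily
    dateext               # append a date instead of a number
    dateformat -%Y%m%d    # produces suffixes like -20180804
    rotate 10
    notifempty
}
```

Because the pattern hiveserver2.log* also matches an already-rotated hiveserver2.log.23-20180804, the next nightly run renames that file again, yielding hiveserver2.log.23-20180804-20180805.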

Please advise what could be the reason we get a log name like

hiveserver2.log.23-20180804-20180805

instead of the expected

hiveserver2.log.23

0 Answers:

No answers yet