I need to deal with repetitions of specific exceptions in my logs.
I use slf4j with logback for logging in my application. The application talks to several external services (a DB, Apache Kafka, third-party libraries, and so on). When the connection to such a service is lost, I get an exception, for example:
java.net.ConnectException: Connection refused: no further information
The problem is that I get this message every second. The exception messages flood my log file, so after N hours I have several GB of logs.
I would like a log message about this exception only once every 1-5 minutes. Is there a way to ignore repetitions of a particular exception in the log file?
Possible solutions:

1. Ignore all logging from the specific package and class. [Bad, because I could skip important messages.]
2. Use http://logback.qos.ch/manual/filters.html#DuplicateMessageFilter [Bad, because I can only set the AllowedRepetitions or CacheSize properties; it matches all messages, while I only need to match specific ones. See the configuration sketch after this list.]
3. Write a custom filter. Or perhaps you know of an existing solution?
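For reference, a minimal sketch of how the stock filter from the logback manual is wired up; it keys on the raw message only, with no way to narrow it down to one logger or exception type:

<turboFilter class="ch.qos.logback.classic.turbo.DuplicateMessageFilter">
    <allowedRepetitions>0</allowedRepetitions>
    <cacheSize>100</cacheSize>
</turboFilter>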
Answer 0 (score: 0)
I think your best option is simply to extend the DuplicateMessageFilter you have already found. It is not final, and extending it is fairly easy:

- override the filtering method to filter by class name, exception type, or whatever you prefer
- then delegate the duplicate check to the parent class

The parameters available to you:

public FilterReply decide(Marker marker, Logger logger, Level level,
                          String format, Object[] params, Throwable t) {

include the Throwable and the Logger. A sketch follows.
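A minimal sketch of that idea, assuming you only want to throttle one exception type; the class name ExceptionAwareDuplicateMessageFilter and its exceptionClass property are illustrative, not part of logback:

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.DuplicateMessageFilter;
import ch.qos.logback.core.spi.FilterReply;
import org.slf4j.Marker;

public class ExceptionAwareDuplicateMessageFilter extends DuplicateMessageFilter {

    // fully qualified name of the exception to throttle, set from logback.xml,
    // e.g. java.net.ConnectException
    private String exceptionClass;

    @Override
    public FilterReply decide(Marker marker, Logger logger, Level level,
                              String format, Object[] params, Throwable t) {
        // leave every other event alone; only the configured exception type
        // is subjected to the parent's duplicate check
        if (t == null || !t.getClass().getName().equals(exceptionClass)) {
            return FilterReply.NEUTRAL;
        }
        return super.decide(marker, logger, level, format, params, t);
    }

    public void setExceptionClass(String exceptionClass) {
        this.exceptionClass = exceptionClass;
    }
}

Keep in mind that the parent still keys its cache on the message text alone, so two different exceptions carrying the same message would share a repetition counter.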
Answer 1 (score: 0)
It is quite easy to write a new turbo filter and implement whatever logic you need to deny specific log events.
I added the new filter with the following configuration in logback.xml:
<turboFilter class="package.DuplicationTimeoutTurboFilter">
    <MinutesToBlock>3</MinutesToBlock>
    <KeyPattern>
        <loggerClass>org.apache.kafka.common.network.Selector</loggerClass>
        <message>java.net.ConnectException: Connection refused: no further information</message>
    </KeyPattern>
</turboFilter>
And the implementation (logback's Joran configurator maps <MinutesToBlock> to setMinutesToBlock and each <KeyPattern> element to addKeyPattern, which is why the filter exposes those methods):
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.turbo.TurboFilter;
import ch.qos.logback.core.spi.FilterReply;
import org.slf4j.Marker;

import java.time.LocalDateTime;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;

public class DuplicationTimeoutTurboFilter extends TurboFilter {

    private static final int CLEAN_UP_THRESHOLD = 1000;

    // last time each configured pattern was allowed through
    private final ConcurrentHashMap<KeyPattern, LocalDateTime> recentlyMatchedPatterns = new ConcurrentHashMap<>();
    // patterns registered from logback.xml via addKeyPattern()
    private final Set<KeyPattern> ignoringPatterns = new HashSet<>();
    private long minutesToBlock = 3L;

    @Override
    public FilterReply decide(Marker marker, Logger logger, Level level, String format, Object[] params, Throwable t) {
        // Build one searchable string from the message, its parameters and the
        // throwable (sometimes the throwable arrives inside the params array).
        String rawLogMessage = format + Arrays.toString(params) + Objects.toString(t);

        Set<KeyPattern> matchedIgnoringSet = ignoringPatterns.stream()
                .filter(key -> match(key, logger, rawLogMessage))
                .collect(Collectors.toSet());

        if (!matchedIgnoringSet.isEmpty() && isLoggedRecently(matchedIgnoringSet)) {
            return FilterReply.DENY;
        }
        return FilterReply.NEUTRAL;
    }

    private boolean match(KeyPattern keyPattern, Logger logger, String rawText) {
        String loggerClass = keyPattern.getLoggerClass();
        String messagePattern = keyPattern.getMessage();
        return loggerClass.equals(logger.getName()) && rawText.contains(messagePattern);
    }

    private boolean isLoggedRecently(Set<KeyPattern> matchedIgnoredList) {
        for (KeyPattern pattern : matchedIgnoredList) {
            LocalDateTime now = LocalDateTime.now();
            LocalDateTime lastLogTime = recentlyMatchedPatterns.putIfAbsent(pattern, now);
            if (lastLogTime == null) {
                // first occurrence: remember it and let the event through
                return false;
            }
            LocalDateTime blockedTillTime = lastLogTime.plusMinutes(minutesToBlock);
            if (blockedTillTime.isAfter(now)) {
                // still inside the blocking window
                return true;
            } else {
                // window has expired: restart it and let this event through
                recentlyMatchedPatterns.put(pattern, now);
                cleanupIfNeeded();
                return false;
            }
        }
        return true;
    }

    private void cleanupIfNeeded() {
        if (recentlyMatchedPatterns.size() > CLEAN_UP_THRESHOLD) {
            LocalDateTime oldTime = LocalDateTime.now().minusMinutes(minutesToBlock * 2);
            // drop stale entries, i.e. those last seen before the cutoff
            recentlyMatchedPatterns.values().removeIf(lastLogTime -> lastLogTime.isBefore(oldTime));
        }
    }

    public long getMinutesToBlock() {
        return minutesToBlock;
    }

    public void setMinutesToBlock(long minutesToBlock) {
        this.minutesToBlock = minutesToBlock;
    }

    public void addKeyPattern(KeyPattern keyPattern) {
        ignoringPatterns.add(keyPattern);
    }

    public static class KeyPattern {
        private String loggerClass;
        private String message;

        // setters are required so Joran can populate the nested XML elements;
        // equals/hashCode make the pattern usable as a map key
        public String getLoggerClass() { return loggerClass; }
        public void setLoggerClass(String loggerClass) { this.loggerClass = loggerClass; }
        public String getMessage() { return message; }
        public void setMessage(String message) { this.message = message; }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof KeyPattern)) return false;
            KeyPattern that = (KeyPattern) o;
            return Objects.equals(loggerClass, that.loggerClass) && Objects.equals(message, that.message);
        }

        @Override
        public int hashCode() {
            return Objects.hash(loggerClass, message);
        }
    }
}
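A hypothetical smoke test (the logger name and exception mirror the KeyPattern configured above): only the first event of the burst should reach the appenders, and the next one no sooner than minutesToBlock later.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class FilterSmokeTest {
    public static void main(String[] args) {
        // same logger name as the <loggerClass> element in logback.xml
        Logger log = LoggerFactory.getLogger("org.apache.kafka.common.network.Selector");
        for (int i = 0; i < 5; i++) {
            // Objects.toString(t) in the filter folds the throwable's text into the
            // raw message, so it matches the configured <message> substring
            log.warn("Connection lost",
                    new java.net.ConnectException("Connection refused: no further information"));
        }
    }
}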