I'm trying to implement a custom AppenderFactory for the Splunk HTTP Event Collector. I wrote a simple class, as follows:
package com.example.app;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;
import ch.qos.logback.core.AppenderBase;
import com.fasterxml.jackson.annotation.JsonTypeName;
import io.dropwizard.logging.AbstractAppenderFactory;
import io.dropwizard.logging.async.AsyncAppenderFactory;
import io.dropwizard.logging.filter.LevelFilterFactory;
import io.dropwizard.logging.layout.LayoutFactory;
@JsonTypeName("splunk")
public class SplunkAppenderFactory extends AbstractAppenderFactory<ILoggingEvent> {
    @Override
    public Appender<ILoggingEvent> build(LoggerContext context, String applicationName, LayoutFactory<ILoggingEvent> layoutFactory, LevelFilterFactory<ILoggingEvent> levelFilterFactory, AsyncAppenderFactory<ILoggingEvent> asyncAppenderFactory) {
        System.out.println("Setting up SplunkAppenderFactory!");
        final SplunkAppender appender = new SplunkAppender();
        appender.setName("splunk-appender");
        appender.setContext(context);
        appender.start();
        return wrapAsync(appender, asyncAppenderFactory);
    }
}
class SplunkAppender extends AppenderBase<ILoggingEvent> {
    @Override
    protected void append(ILoggingEvent eventObject) {
        // Stub: just echo the event; a real implementation would ship it to Splunk.
        System.out.println("Splunk: " + eventObject.toString());
    }
}
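The SplunkAppender above only echoes events to stdout. For reference, here is a self-contained sketch of how events could eventually be POSTed to an HEC-style endpoint. This is not Dropwizard- or Splunk-library code: the endpoint path, token header format, and JSON shape follow Splunk's HEC conventions but are assumptions here, and the local HttpServer merely stands in for a real Splunk instance:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class HecSketch {
    // Hypothetical helper: POST one JSON event to an HEC-style endpoint.
    static int postEvent(String endpoint, String token, String json) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Splunk " + token);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for Splunk: a local HTTP server that records what it received.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        CompletableFuture<String> received = new CompletableFuture<>();
        server.createContext("/services/collector", exchange -> {
            byte[] body = exchange.getRequestBody().readAllBytes();
            received.complete(new String(body, StandardCharsets.UTF_8));
            exchange.sendResponseHeaders(200, 0);
            exchange.close();
        });
        server.start();
        String url = "http://localhost:" + server.getAddress().getPort() + "/services/collector";
        int status = postEvent(url, "dummy-token", "{\"event\":\"hello splunk\"}");
        System.out.println("status=" + status + " body=" + received.get(5, TimeUnit.SECONDS));
        server.stop(0);
    }
}
```

In a real appender you would do this inside append() (ideally batched and off the caller's thread, which is what wrapAsync gives you).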
Supposedly we don't need to wire anything up, because Dropwizard automatically scans for and wires things. But when I run the application, I get this error:
./infrastructure/config/config.yml has an error:
* Failed to parse configuration at: logging.appenders.[2]; Could not resolve type id 'splunk' into a subtype of [simple type, class io.dropwizard.logging.AppenderFactory]: known type ids = [AppenderFactory, console, file, syslog] at [Source: N/A; line: -1, column: -1] (through reference chain: com.example.app.AppConfiguration["logging"] -> io.dropwizard.logging.DefaultLoggingFactory["appenders"] -> java.util.ArrayList[2])
My app.config is as follows:
logging:
  appenders:
    # log format: <Level> - <Time> - <Revision> - <Environment> - <Thread> - <Log Content>
    - type: console
      logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
      threshold: ${CONSOLE_LOG_LEVEL:-ERROR}
    - type: file
      threshold: INFO
      logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
      # The file to which current statements will be logged.
      currentLogFilename: ./logs/app.log
      # When the log file rotates, the archived log will be renamed to this and gzipped. The
      # %d is replaced with the previous day (yyyy-MM-dd). Custom rolling windows can be created
      # by passing a SimpleDateFormat-compatible format as an argument: "%d{yyyy-MM-dd-hh}".
      archivedLogFilenamePattern: ./logs/app-%d.log.gz
      # The number of archived files to keep.
      archivedFileCount: 10
      # The timezone used to format dates. HINT: USE THE DEFAULT, UTC.
      timeZone: UTC
    - type: splunk
      logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
      threshold: INFO
How can I make this work?
Answer 0 (score: 3)
You may have to create a file named:

META-INF/services/io.dropwizard.logging.AppenderFactory

in your project's resources folder. The content of this file is the fully qualified name of the available appender factory class (or classes):
com.example.app.SplunkAppenderFactory
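This works because Dropwizard's discovery of pluggable @JsonTypeName subtypes is built on Java's standard ServiceLoader mechanism: any class listed in a META-INF/services/<interface-name> file on the classpath is picked up at runtime. A self-contained sketch of that mechanism, using hypothetical Greeter/HelloGreeter names and a temp directory in place of your resources folder:

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.ServiceLoader;

public class ServiceLoaderDemo {
    public interface Greeter { String greet(); }
    public static class HelloGreeter implements Greeter {
        public String greet() { return "hello"; }
    }

    public static void main(String[] args) throws Exception {
        // Simulate a resources folder containing META-INF/services/<interface-name>.
        Path root = Files.createTempDirectory("svc");
        Path services = root.resolve("META-INF/services");
        Files.createDirectories(services);
        // The file is named after the interface; each line names an implementation.
        Files.write(services.resolve(Greeter.class.getName()),
                Collections.singletonList(HelloGreeter.class.getName()));

        // Put that folder on the classpath and let ServiceLoader discover the provider.
        try (URLClassLoader cl = new URLClassLoader(
                new URL[]{ root.toUri().toURL() },
                ServiceLoaderDemo.class.getClassLoader())) {
            for (Greeter g : ServiceLoader.load(Greeter.class, cl)) {
                System.out.println("discovered: " + g.greet());
            }
        }
    }
}
```

In your project you don't need any of the classloader plumbing; placing the file under src/main/resources/META-INF/services/ achieves the same effect at build time.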
The core DW project also contains this file with the default appenders: