I am working on a Spring Boot and Spring Scheduler project that runs multiple batches.
At the moment the information for all batches is written to a single log file (only one log file gets created).
Now I need to write each batch's information to its own log file, i.e. no. of batches = that many no. of log files.
Please note that since I am using Spring Boot there is only one main class, and all batches share a single package, a single service, and a single repository.
Here is my current log4j2 configuration:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" monitorInterval="30">
    <Properties>
        <Property name="LOG_PATTERN">$${ctx:filename} %d %p %c{1.} [%t] %m%n</Property>
        <Property name="APP_LOG_ROOT">C:/job-logs/claims-dms/</Property>
        <Property name="APP_LOG_BACK_ROOT">C:/job-logs/claims-dms/back/</Property>
    </Properties>
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT" follow="true">
            <PatternLayout pattern="${LOG_PATTERN}" />
        </Console>
        <RollingFile name="appLog" fileName="${APP_LOG_ROOT}claims-dms.log"
                     filePattern="${APP_LOG_BACK_ROOT}claims-dms-%d{yyyy-MM-dd}-%i.log.gz">
            <PatternLayout pattern="${LOG_PATTERN}" />
            <Policies>
                <SizeBasedTriggeringPolicy size="500MB" />
                <TimeBasedTriggeringPolicy interval="1" modulate="true" />
            </Policies>
            <DefaultRolloverStrategy max="1" />
        </RollingFile>
    </Appenders>
    <Loggers>
        <Logger name="com.bct" additivity="false" level="all">
            <AppenderRef ref="appLog" />
            <AppenderRef ref="Console" />
        </Logger>
        <Logger name="org.hibernate.SQL" additivity="false" level="all">
            <AppenderRef ref="appLog" />
            <AppenderRef ref="Console" />
        </Logger>
        <Logger name="org.hibernate.type.descriptor.sql" additivity="false" level="all">
            <AppenderRef ref="appLog" />
            <AppenderRef ref="Console" />
        </Logger>
        <Logger name="org.springframework.jdbc.core" additivity="false" level="all">
            <AppenderRef ref="appLog" />
            <AppenderRef ref="Console" />
        </Logger>
        <Root>
            <AppenderRef ref="Console" />
        </Root>
    </Loggers>
</Configuration>
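A note on the $${ctx:filename} lookup in LOG_PATTERN above: it is resolved per log event from the Log4j2 ThreadContext, so it only prints whatever value the running batch has put under the "filename" key; by itself it does not split the output into separate files. A rough sketch of how a batch would populate that key (the class and batch name here are hypothetical):

import org.apache.logging.log4j.ThreadContext;

public class ClaimsBatch {

    public void run() {
        // Set the "filename" entry read by $${ctx:filename} in LOG_PATTERN.
        // "claims-batch" is only an illustrative value.
        ThreadContext.put("filename", "claims-batch");
        try {
            // ... batch work and logging ...
        } finally {
            // Clear the entry so it does not leak to other work on this thread.
            ThreadContext.remove("filename");
        }
    }
}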
Answer 0 (score: 0)
You can put the batch name into the MDC when your job is triggered and use that key in your logback.xml:
import org.slf4j.MDC;

@Scheduled
public void scheduleJob() {
    // The key must match the discriminator <key> in logback.xml ("jobName").
    MDC.put("jobName", jobName);
    // other stuff
}
The corresponding logback.xml:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
        <!-- In the absence of the class attribute, the discriminator defaults to
             ch.qos.logback.classic.sift.MDCBasedDiscriminator -->
        <discriminator>
            <key>jobName</key>
            <defaultValue>batch-service</defaultValue>
        </discriminator>
        <sift>
            <appender name="FILE-${jobName}" class="ch.qos.logback.core.rolling.RollingFileAppender">
                <file>${server.docroot}/logs/${jobName}.log</file>
                <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
                    <fileNamePattern>${server.docroot}/logs/${jobName}.%i.log</fileNamePattern>
                    <minIndex>1</minIndex>
                    <maxIndex>5</maxIndex>
                </rollingPolicy>
                <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
                    <maxFileSize>100MB</maxFileSize>
                </triggeringPolicy>
                <layout class="ch.qos.logback.classic.PatternLayout">
                    <pattern>%d [%thread] %level %mdc %logger{35} - %msg%n</pattern>
                </layout>
            </appender>
        </sift>
    </appender>

    <root level="INFO">
        <appender-ref ref="SIFT" />
    </root>
</configuration>
A new log file will now be created for each job.
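As a usage sketch (the job names, cron expressions, and helper method below are illustrative, not part of the original answer): because the scheduler reuses pool threads, each job should also clear the MDC key when it finishes, otherwise the next job that happens to run on the same thread can be sifted into the wrong file.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class BatchJobs {

    private static final Logger log = LoggerFactory.getLogger(BatchJobs.class);

    @Scheduled(cron = "0 0 1 * * *")
    public void claimsBatch() {
        runWithJobName("claims-batch", () -> log.info("claims batch running"));
    }

    @Scheduled(cron = "0 0 2 * * *")
    public void paymentsBatch() {
        runWithJobName("payments-batch", () -> log.info("payments batch running"));
    }

    // Sets the MDC key used by the SiftingAppender discriminator, runs the job,
    // and always clears the key so pooled scheduler threads do not leak it.
    private void runWithJobName(String jobName, Runnable job) {
        MDC.put("jobName", jobName);
        try {
            job.run();
        } finally {
            MDC.remove("jobName");
        }
    }
}

With the configuration above this produces claims-batch.log and payments-batch.log under ${server.docroot}/logs/, while anything logged outside a job falls back to batch-service.log.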