I am trying to output logs from a Spark application using a custom log4j file. However, everything is printed on a single line, like this:
{"name":"org.apache.spark.storage.BlockManagerMasterEndpoint","time":1550067035445,"date":"2019-02-13 14:10:35,445","level":"INFO","thread":"dispatcher-event-loop-0","message":"Registering block manager 413.9 MB RAM, BlockManagerId(driver, spark-example-1550067018081-driver-svc.default.svc, 7079, None)"}{"name":"org.apache.spark.storage.BlockManagerMaster","time":1550067035451,"date":"2019-02-13 14:10:35,451","level":"INFO","thread":"main","message":"Registered BlockManager BlockManagerId(driver, spark-driver, 7079, None)"}
Is there a ConversionPattern or any other setting I can define on the appender to get multi-line log output (one record per line)?
Log4j file:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>
<appender name="Json" class="org.apache.log4j.ConsoleAppender">
<layout class="org.apache.hadoop.log.Log4Json">
</layout>
</appender>
<root>
<level value="INFO"/>
<appender-ref ref="Json"/>
</root>
</log4j:configuration>
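For comparison, a plain PatternLayout whose ConversionPattern ends in %n does put each record on its own line, but then I lose the JSON structure that Log4Json produces. A minimal sketch of what I mean (the appender name "PlainText" is just illustrative):

<appender name="PlainText" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
        <!-- %n appends the platform line separator after each event -->
        <param name="ConversionPattern" value="%d{ISO8601} %p %c{1}: %m%n"/>
    </layout>
</appender>

Is there an equivalent way to make the Log4Json layout terminate each record with a newline?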