Using Log4j

Time: 2019-02-15 09:47:24

Tags: json scala apache-spark log4j

I have a custom Log4j file for a Spark application. I would like to output the Spark application id together with the other attributes (such as message and date), so that the JSON string has the following structure:

{"name":,"time":,"date":,"level":,"thread":,"message":,"app_id":}

Currently, the structure looks like this:

{"name":,"time":,"date":,"level":,"thread":,"message":}

How can I define such a layout for the Spark driver logs?

My log4j file is as follows:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>

    <appender name="Json" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.hadoop.log.Log4Json">
            <param name="ConversionLayout" value=""/>
        </layout>
    </appender>

    <root>
        <level value="INFO"/>
        <appender-ref ref="Json"/>
    </root>
</log4j:configuration>

1 Answer:

Answer 0 (score: 1):

I doubt that org.apache.hadoop.log.Log4Json can be adapted for this purpose. Judging from its javadoc and source code, doing so would be rather cumbersome.

Although it looks like you are using Log4j 1.x, its API is quite flexible, and we can easily define our own layout by extending org.apache.log4j.Layout.

We need a case class that follows the target structure and will be serialized to JSON:

case class LoggedMessage(name: String,
                         appId: String,
                         thread: String,
                         time: Long,
                         level: String,
                         message: String)
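
With json4s, the case class field names become the JSON keys. As a quick sanity check (a minimal sketch, assuming the json4s-native module used below is available on the classpath), serializing one instance shows the resulting shape:

import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write

implicit val formats: DefaultFormats.type = DefaultFormats
// prints something like:
// {"name":"demo","appId":"local-123","thread":"main","time":0,"level":"INFO","message":"hello"}
println(write(LoggedMessage("demo", "local-123", "main", 0L, "INFO", "hello")))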

The Layout can then be extended as follows. To access the value of "app_id", we use Log4j's Mapped Diagnostic Context (MDC):

import org.apache.log4j.Layout
import org.apache.log4j.spi.LoggingEvent
import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write

class JsonLoggingLayout extends Layout {
  // required by the API; returning false signals that the layout itself is
  // responsible for rendering any throwable attached to the logging event
  override def ignoresThrowable(): Boolean = false
  // required by the API
  override def activateOptions(): Unit = { /* nothing */ }

  override def format(event: LoggingEvent): String = {
    // we are using json4s for JSON serialization
    implicit val formats = DefaultFormats

    // retrieve app_id from Mapped Diagnostic Context
    val appId = event.getMDC("app_id") match {
      case null => "[no_app]" // logged messages outside our app
      case defined: AnyRef => defined.toString
    }
    val message = LoggedMessage("TODO", // placeholder name; event.getLoggerName could be used here
                                appId,
                                Thread.currentThread().getName,
                                event.getTimeStamp,
                                event.getLevel.toString,
                                event.getMessage.toString)
    write(message) + "\n"
  }

}

Finally, once the Spark session has been created, we put the app_id value into the MDC:

import org.apache.log4j.{Logger, MDC}
import org.apache.spark.sql.SparkSession

// create Spark session (a minimal example; use your existing session)
val session = SparkSession.builder().getOrCreate()
val logger = Logger.getLogger(getClass)

// make the application id visible to the layout via the Mapped Diagnostic Context
MDC.put("app_id", session.sparkContext.applicationId)

logger.info("-------- this is info --------")
logger.warn("-------- THIS IS A WARNING --------")
logger.error("-------- !!! ERROR !!! --------")
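
Note that the MDC is local to the driver JVM (and, in Log4j 1.x, backed by a thread-local map), so only messages logged on the driver after the MDC.put call carry the application id; anything logged earlier, or from executor JVMs, falls back to the "[no_app]" default handled in the layout above.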

This produces the following logs:

{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708149,"level":"INFO","message":"-------- this is info --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"WARN","message":"-------- THIS IS A WARNING --------"}
{"name":"TODO","appId":"local-1550247707920","thread":"main","time":1550247708150,"level":"ERROR","message":"-------- !!! ERROR !!! --------"}

And of course, don't forget to reference the implementation in the log4j config XML:

<appender name="Json" class="org.apache.log4j.ConsoleAppender">
  <layout class="stackoverflow.q54706582.JsonLoggingLayout" />
</appender>
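
One last practical note (an assumption about a typical Spark 2.x / Log4j 1.x setup, not covered in the original answer): for the driver to use this layout, the compiled JsonLoggingLayout class must be on the driver classpath (usually it is enough to package it inside the application jar), and the driver JVM must be pointed at this XML file, e.g. via spark-submit's --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.xml" or the equivalent spark.driver.extraJavaOptions setting.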