Why isn't Spark writing log4j logs to stderr?

Time: 2020-08-07 13:29:48

Tags: apache-spark hadoop log4j stderr

I even submitted the job with a wrong class name to force a ClassNotFoundException, but still no luck: stderr shows nothing beyond the SLF4J notices below.

Log Type: stderr    
Log Upload Time: Fri Aug 07 13:16:07 +0000 2020    
Log Length: 465    
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.6.2-2/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.6.2-2/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

I can see that I have multiple SLF4J bindings, but in the past logging worked fine even with multiple bindings.
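
For comparison, here is a minimal sketch in the style of Spark 2.x's stock conf/log4j.properties.template. The stanza below is what normally routes root-logger output to stderr, so if the log4j.properties actually picked up by the driver and executors lacks a console appender targeting System.err (or sets a more restrictive level), log4j output would never reach the stderr log shown above. This is illustrative, not the asker's actual config:

# Sketch of the default Spark 2.x log4j.properties:
# send everything at INFO and above to the console appender,
# which writes to System.err (i.e. the YARN container's stderr log).
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n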

0 Answers:

No answers yet.