How can I forward Spark logs to a Jupyter Notebook?

Asked: 2019-04-24 10:44:21

Tags: apache-spark pyspark jupyter-notebook

I know I can set the log level with spark.sparkContext.setLogLevel('INFO'). Log messages like the ones below then appear in the terminal, but not in the Jupyter notebook.

2019-03-25 11:42:37 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2019-03-25 11:42:37 WARN  SparkConf:66 - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
2019-03-25 11:42:38 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

The Spark session is created in local mode in a Jupyter notebook cell:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .master('local[7]') \
    .appName('Notebook') \
    .getOrCreate()
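For reference, the log level mentioned above is then set on this session's context:

spark.sparkContext.setLogLevel('INFO')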

Is there any way to forward these logs to the Jupyter notebook?
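A minimal sketch of one possible workaround, not a confirmed answer: the log lines come from the driver JVM rather than the Python process, which is why Jupyter never sees them. One can attach a log4j FileAppender to the root logger through the py4j gateway and then read the file back in a notebook cell. This assumes a Spark build that uses log4j 1.x (e.g. Spark 2.x); the file path /tmp/spark-notebook.log and the pattern string are illustrative choices, not anything required by Spark.

# Sketch: route driver-side log4j output to a file, then read it in the notebook.
# Assumes log4j 1.x; the file path is an arbitrary choice.
log4j = spark.sparkContext._jvm.org.apache.log4j
root_logger = log4j.LogManager.getRootLogger()

# Append all driver log events to a local file.
appender = log4j.FileAppender(
    log4j.PatternLayout("%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n"),
    "/tmp/spark-notebook.log",
)
root_logger.addAppender(appender)

# Later, in another cell, display whatever has been logged so far.
with open("/tmp/spark-notebook.log") as f:
    print(f.read())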

0 Answers:

No answers yet.