How to remove warning messages in pyspark

Asked: 2018-01-25 02:36:48

Tags: python apache-spark

I have set up Spark and Hadoop.

I downloaded them, configured the environment, and ran pyspark from the shell.

This is the output of running 'pyspark':

dino@ubuntu:~$ pyspark
Python 2.7.12 (default, Dec  4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/24 18:25:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/24 18:25:14 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.20.147 instead (on interface ens33)
18/01/24 18:25:14 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/01/24 18:25:24 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/

Using Python version 2.7.12 (default, Dec  4 2017 14:50:18)
SparkSession available as 'spark'.
>>> 

(I am using Ubuntu 16.04; Spark is version 2.2.1 and Hadoop is 2.7.5.)

There are several WARN messages, and I would like to know what they mean and how to get rid of them.
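
The banner itself says "To adjust logging level use sc.setLogLevel(newLevel)", so I assume something like the following would hide the warnings once the shell is running (sc here is the SparkContext taken from the 'spark' session the banner mentions, and "ERROR" is just my guess at a level above WARN):

>>> sc = spark.sparkContext        # the shell's SparkSession is exposed as 'spark'; take its SparkContext
>>> sc.setLogLevel("ERROR")        # from here on, only ERROR and above should be printed

If I understand correctly, though, this only takes effect after the session has started, so the WARN lines printed during startup would still appear; silencing those presumably means copying conf/log4j.properties.template to conf/log4j.properties in the Spark directory and lowering log4j.rootCategory there. Is that the right approach, and are these particular warnings safe to ignore?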
