pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"

Date: 2018-01-27 09:00:20

Tags: hadoop apache-spark pyspark

I am trying to launch pyspark and keep getting this error. I fixed it once by changing the permissions on the tmp\hive directory, but the error has come back and now seems impossible to resolve.

Python 2.7.13 (v2.7.13:a06454b1afa1, Dec 17 2016, 20:42:59) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/27 14:24:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "D:\spark-2.2.1-bin-hadoop2.7\spark-2.2.1-bin-hadoop2.7\python\pyspark\shell.py", line 45, in <module>
    spark = SparkSession.builder\
  File "D:\spark-2.2.1-bin-hadoop2.7\spark-2.2.1-bin-hadoop2.7\python\pyspark\sql\session.py", line 183, in getOrCreate
    session._jsparkSession.sessionState().conf().setConfString(key, value)
  File "D:\spark-2.2.1-bin-hadoop2.7\spark-2.2.1-bin-hadoop2.7\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 1133, in __call__

  File "D:\spark-2.2.1-bin-hadoop2.7\spark-2.2.1-bin-hadoop2.7\python\pyspark\sql\utils.py", line 79, in deco
    raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"

I have tried almost everything. Can someone please help me?

1 answer:

Answer 0 (score: 0):

https://techgimmick.wordpress.com/2018/03/19/error-while-instantiating-org-apache-spark-sql-hive-hivesession-statebuilder/

This blog post gives a good explanation of what is going on and how to fix it.

It suggests the problem is most likely caused by your corporate IT policy, which may prevent you from actually modifying the file permissions on the tmp\hive directory.

To check whether the permissions really were modified, use the ls command rather than chmod.
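A minimal sketch of that check, assuming a Windows setup where winutils.exe is on the PATH (typically %HADOOP_HOME%\bin) and the Hive scratch directory is the default \tmp\hive on the current drive; your paths may differ:

```shell
:: Grant full permissions on the Hive scratch directory (the usual fix):
winutils.exe chmod -R 777 \tmp\hive

:: Verify the change actually took effect. 'ls' shows the effective
:: permissions, whereas 'chmod' can fail silently under a restrictive
:: IT policy:
winutils.exe ls \tmp\hive
```

If ls still reports restrictive permissions (anything other than drwxrwxrwx on the directory), the chmod did not take effect, and a domain or trust-relationship problem of the kind described next is a likely cause.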

If it gives you an error saying that a trust relationship could not be established, you will need to contact your IT team, or remove your machine from the domain and rejoin it, as described by Microsoft here:

https://support.microsoft.com/en-in/help/2771040/the-trust-relationship-between-this-workstation-and-the-primary-domain