PySpark awaitResult error on DataFrame inner join

Asked: 2018-05-21 18:43:05

Tags: python apache-spark pyspark apache-spark-sql

Running standalone spark-2.3.0-bin-hadoop2.7 in a Docker container

  • df1 = 5 rows
  • df2 = 10 rows
  • Both datasets are very small.

    df1 schema: DataFrame[id: bigint, name: string]
    df2 schema: DataFrame[id: decimal(12,0), age: int]
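For reference, a minimal sketch of how two frames with these schemas could be built (the question does not show the source data, so the rows below are made-up placeholders):

    from decimal import Decimal

    from pyspark.sql import SparkSession
    from pyspark.sql.types import (DecimalType, IntegerType, LongType,
                                   StringType, StructField, StructType)

    spark = SparkSession.builder.appName("join-repro").getOrCreate()

    # 5 rows, id: bigint, name: string
    df1 = spark.createDataFrame(
        [(i, "name%d" % i) for i in range(5)],
        StructType([StructField("id", LongType()),
                    StructField("name", StringType())]))

    # 10 rows, id: decimal(12,0), age: int
    df2 = spark.createDataFrame(
        [(Decimal(i), 20 + i) for i in range(10)],
        StructType([StructField("id", DecimalType(12, 0)),
                    StructField("age", IntegerType())]))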

Inner join

df3 = df1.join(df2, df1.id == df2.id, 'inner')

df3 schema: DataFrame[id: bigint, name: string, age: int]
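Both inputs are far below the default 10 MB spark.sql.autoBroadcastJoinThreshold, so the planner picks a broadcast hash join; the awaitResult in the stack trace below is BroadcastExchangeExec waiting for that broadcast to finish. This can be confirmed from the physical plan:

    # Print the physical plan; with tables this small it will normally show
    # a BroadcastExchange feeding a BroadcastHashJoin.
    df3.explain()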

Executing df3.show(5) raises the following error:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/apache/spark-2.3.0-bin-hadoop2.7/python/pyspark/sql/dataframe.py", line 466, in collect
        port = self._jdf.collectToPython()
      File "/usr/local/lib/python3.6/dist-packages/py4j/java_gateway.py", line 1257, in __call__
        answer, self.gateway_client, self.target_id, self.name)
      File "/usr/apache/spark-2.3.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
        return f(*a, **kw)
      File "/usr/local/lib/python3.6/dist-packages/py4j/protocol.py", line 328, in get_return_value
        format(target_id, ".", name), value)
    py4j.protocol.Py4JJavaError: An error occurred while calling o43.collectToPython.
    : org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
        at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec.doExecuteBroadcast(BroadcastExchangeExec.scala:136)

Tried setting the broadcast timeout to -1 per this suggestion, but got the same error:

    from pyspark import SparkConf
    conf = SparkConf().set("spark.sql.broadcastTimeout", "-1")
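Note that a SparkConf built after the session already exists has no effect on it; spark.sql.broadcastTimeout is a runtime SQL conf, so it can be set directly on the live session. A related workaround (an assumption here, not something the question reports trying) is to disable automatic broadcasting so the join falls back to a sort-merge join:

    # Runtime SQL confs can be changed on an existing SparkSession:
    spark.conf.set("spark.sql.broadcastTimeout", "-1")
    # Setting the threshold to -1 disables auto-broadcast joins entirely,
    # so Spark plans a sort-merge join instead of the failing broadcast.
    spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")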

1 Answer:

Answer 0 (score: 0)

I was using a JRE version incompatible with Spark 2.3.

The error was resolved after updating the JRE in the Docker image to openjdk-8-jre.
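For anyone hitting the same thing, one quick way to check which Java version the driver JVM is actually running on is through the SparkContext's internal py4j handle (an internal attribute, so treat this as a debugging sketch); Spark 2.x requires Java 8:

    # Prints the driver JVM's Java version, e.g. "1.8.0_171" for Java 8.
    print(spark.sparkContext._jvm.java.lang.System.getProperty("java.version"))

In a Debian-based image the fix typically corresponds to installing the openjdk-8-jre package and pointing JAVA_HOME at it.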