Error calling a UDF that uses a broadcast object in PySpark

Asked: 2017-11-14 10:12:31

Tags: pyspark apache-spark-sql spark-dataframe user-defined-functions

I am trying to call a UDF that uses a broadcast object in PySpark.

Here is a minimal example that reproduces the situation and the error:

import pyspark.sql.functions as sf
from pyspark.sql.types import LongType


class SquareClass:
    def compute(self, n):
        return n ** 2


square = SquareClass()
square_sc = sc.broadcast(square)

def f(n):
    return square_sc.value.compute(n)  

numbers = sc.parallelize([{'id': i} for i in range(10)]).toDF()
f_udf = sf.udf(f, LongType())  

numbers.select(f_udf(numbers.id)).show(10)

The stack trace and error message produced by this snippet:

Traceback (most recent call last)
<ipython-input-75-6e38c014e4b2> in <module>()
     13 f_udf = sf.udf(f, LongType())
     14 
---> 15 numbers.select(f_udf(numbers.id)).show(10)

/usr/hdp/current/spark-client/python/pyspark/sql/dataframe.py in show(self, n, truncate)
    255         +---+-----+
    256         """
--> 257         print(self._jdf.showString(n, truncate))
    258 
    259     def __repr__(self):

/usr/local/lib/python3.5/dist-packages/py4j/java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, 

<snip>

An error occurred while calling o938.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 49.0 failed 1 times, most recent failure: Lost task 1.0 in stage 49.0 (TID 587, localhost): org.apache.spark.api.python.PythonException: Traceback (most recent call last):

2 Answers:

Answer 0 (score: 2)

When you access the broadcast value through square_sc, the worker has to unpickle an instance of SquareClass, but the module defining SquareClass is not present on the workers.

If you want to use Python packages, classes, or functions inside a UDF, the workers must be able to access them. You can achieve this by putting the code in a Python script and deploying it with --py-files when running spark-submit.
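To illustrate why the worker fails, here is a minimal local sketch (no Spark required, and not part of the original answer): pickle serializes a class instance as a reference to its class, not as the class's code, so the receiving process must be able to import the defining module.

```python
import pickle


class SquareClass:
    def compute(self, n):
        return n ** 2


# pickle stores a *reference* to the class ("<module>.SquareClass"),
# not the class definition itself; a worker process must be able to
# import that module to unpickle the broadcast instance.
blob = pickle.dumps(SquareClass())
assert b"SquareClass" in blob  # only the name is recorded, not the code

# Locally the round trip succeeds because the class is importable here:
obj = pickle.loads(blob)
assert obj.compute(4) == 16
```

On a Spark worker, the equivalent of `pickle.loads` fails because no module providing SquareClass exists there, which is exactly what --py-files fixes.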

Answer 1 (score: 1)

One thing you can do is keep the class in a separate module and add that module to the SparkContext.
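A runnable sketch of this idea (the module name squares.py is hypothetical; the file is created on the fly here so the example is self-contained, and the Spark step is shown as a comment since it needs a live SparkContext):

```python
import os
import pickle
import sys
import tempfile

# Write the class into its own module, squares.py, instead of
# defining it in the driver script (__main__).
module_source = (
    "class SquareClass:\n"
    "    def compute(self, n):\n"
    "        return n ** 2\n"
)
tmpdir = tempfile.mkdtemp()
module_path = os.path.join(tmpdir, "squares.py")
with open(module_path, "w") as fh:
    fh.write(module_source)
sys.path.insert(0, tmpdir)

from squares import SquareClass  # imported from a real module, not __main__

# Because pickle now records "squares.SquareClass", any process that
# has squares.py on its path can unpickle the instance.
obj = pickle.loads(pickle.dumps(SquareClass()))
assert obj.compute(3) == 9

# In Spark, ship the same file to every executor before broadcasting:
#     sc.addPyFile(module_path)
#     square_sc = sc.broadcast(SquareClass())
```

sc.addPyFile distributes the file to the executors and puts it on their Python path, so the broadcast object can be unpickled inside the UDF.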
