spark-submit doesn't seem to support the Python `@property` decorator, but pyspark does. I have a Python file named test.py with basic property example code:
class C(object):
    @property
    def x(self):
        return self._x

    @x.setter
    def x(self, value):
        print "Setting Value"
        self._x = value

    def __init__(self):
        self._x = None

if __name__ == '__main__':
    ob = C()
    ob.x = "test"
When I run it with pyspark, I get:
user@machine:/src$ pyspark test.py
WARNING: Running python applications through 'pyspark' is deprecated as of Spark 1.0.
Use ./bin/spark-submit <python file>
Setting Value
The "Setting Value" output tells me it works. But when I run it with spark-submit, I get:
user@machine:/src$ spark-submit test.py
user@machine:/src$
The x.setter never runs. Further testing shows that the attribute is being set, but the setter function is skipped.
Any ideas why spark-submit ignores the setter function, and how to work around it?
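For reference, outside of Spark the same property/setter pattern behaves as expected. A minimal Python 3 adaptation of test.py (print as a function; the `setter_called` flag is added here purely to make the setter's execution observable) is:

```python
class C(object):
    def __init__(self):
        self._x = None
        self.setter_called = False  # added flag: records whether the setter ran

    @property
    def x(self):
        return self._x

    @x.setter
    def x(self, value):
        print("Setting Value")  # Python 3 print function
        self.setter_called = True
        self._x = value

ob = C()
ob.x = "test"        # goes through the setter, printing "Setting Value"
print(ob.x)          # prints "test"
```

Running this with a plain `python` interpreter prints "Setting Value" and then "test", confirming the property machinery itself is standard Python and the difference lies in how spark-submit executes the script.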