The following code worked fine when I was on Spark 1.6:
ddl = sqlContext.sql("""show create table {mytable}""".format(mytable="""mytest.my_dummytable"""))
map(''.join, ddl\
    .map(lambda my_row: [str(data).replace("`", "'") for data in my_row])\
    .collect())
However, after moving to Spark 2.2, I get the following exception:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-> in <module>()
      1 ddl = sqlContext.sql("""show create table {mytable}""".format(mytable="""mytest.my_dummytable"""))
----> 2 map(''.join, ddl.map(lambda my_row: [str(data).replace("`", "'") for data in my_row]).collect())
spark2/python/pyspark/sql/dataframe.py in __getattr__(self, name)
            if name not in self.columns:
                raise AttributeError(
->                  "'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
            jc = self._jdf.apply(name)
            return Column(jc)
AttributeError: 'DataFrame' object has no attribute 'map'
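
In Spark 2.x the map method was removed from the PySpark DataFrame API, so row-level transformations have to go through the underlying RDD via .rdd. Below is a minimal sketch of how the same statement could be rewritten; the table name is taken from the code above, sqlContext is assumed to still be available as in the question, and wrapping the outer map in list(...) assumes Python 3, where map returns an iterator.

# Sketch of the same logic for Spark 2.x, where DataFrame.map no longer exists.
# Assumes sqlContext (or a SparkSession) is already set up, as in the question.
ddl = sqlContext.sql("show create table {mytable}".format(mytable="mytest.my_dummytable"))

# Go through the underlying RDD to map over rows, then join each row's fields.
statements = list(map(
    ''.join,
    ddl.rdd
       .map(lambda my_row: [str(data).replace("`", "'") for data in my_row])
       .collect()
))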