from pyspark.sql import SparkSession
spark = SparkSession.builder \
.master("local") \
.appName("Hive") \
.getOrCreate()
srcDF = spark.sql("""select name, city, age, location from table where last_run_date > '1900-01-01'""")
srcDF = srcDF.withColumn('newcol', srcDF.age + 2)  # withColumn returns a new DataFrame, so assign the result
srcDF.write.format('json').save('/nwe')
Now I want to add error handling after each command, the way we check $? after every command in a shell script. Can we use try/except blocks to apply generic error handling? The new code should look like this:
from pyspark.sql import SparkSession
spark = SparkSession.builder \
.master("local") \
.appName("Hive") \
.getOrCreate()
srcDF = spark.sql("""select name, city, age, location from table where last_run_date > '1900-01-01'""")
<error handling>
srcDF = srcDF.withColumn('newcol', srcDF.age + 2)
<error handling>
srcDF.write.format('json').save('/nwe')
<error handling>
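
Something like the sketch below is what I have in mind. To be clear, run_step is just a hypothetical helper name I made up for illustration, not a PySpark API; the idea is to factor the try/except into one place instead of repeating the block after every statement:

import sys
from pyspark.sql import SparkSession

def run_step(description, func):
    # Hypothetical helper: run one step and exit non-zero on failure,
    # similar to checking $? after every shell command.
    try:
        return func()
    except Exception as e:
        # Catching Exception for brevity; pyspark.sql.utils.AnalysisException
        # could be caught more narrowly for errors like a missing table/column.
        print(f"Step failed ({description}): {e}", file=sys.stderr)
        sys.exit(1)

spark = SparkSession.builder \
    .master("local") \
    .appName("Hive") \
    .getOrCreate()

srcDF = run_step("read source table", lambda: spark.sql(
    """select name, city, age, location from table where last_run_date > '1900-01-01'"""))
srcDF = run_step("add newcol", lambda: srcDF.withColumn('newcol', srcDF.age + 2))
run_step("write json", lambda: srcDF.write.format('json').save('/nwe'))

One thing I am unsure about: since transformations like withColumn are lazy, would an error in them only surface at the save (the action), making per-statement handling less useful here than it is in shell?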