from pyspark import SparkContext
from pyspark.sql import SQLContext

f = lambda x: str(x)
with SparkContext("local", "HelloWorld") as sc:
    spark = SQLContext(sc)
    spark.udf.register("f", f)
This code works to register the Python UDF once, so that it can then be called from SQL, e.g.:
%sql "select f(col_name) from table_name"
But the registered function does not change the next time this is called (after `f` has been redefined)! How do you redefine a UDF, i.e. re-register it so as to overwrite the old one? Is there a `drop_udf` function, or similar?
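The effect can be reproduced without Spark at all: registration stores a reference to the function *object*, so later rebinding the name `f` in Python does not affect what was registered; only registering again under the same name replaces it. A minimal pure-Python sketch (the `registry` dict here is illustrative, not Spark's internals):

```python
# Illustrative sketch: a registry maps names to function objects.
registry = {}

def register(name, func):
    """Store (or overwrite) a function under the given name."""
    registry[name] = func

f = lambda x: str(x)
register("f", f)

# Rebinding the name `f` in Python does NOT change the registered object...
f = lambda x: str(x).upper()
print(registry["f"]("hi"))   # -> "hi" (still the old lambda)

# ...but registering again under the same name replaces it.
register("f", f)
print(registry["f"]("hi"))   # -> "HI" (the new lambda)
```

If Spark's `register` behaves the same way, calling `spark.udf.register("f", new_f)` again should be enough to replace the old UDF, with no separate drop step needed.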