SaveMode not working in Spark SQL

Asked: 2016-03-21 20:16:05

Tags: scala apache-spark apache-spark-sql

I am running a Spark SQL example using the SaveMode option, but I get the following error.

val df = sqlContext.read.format("json").load("/user/root/spark/data/people.json")
df.select("name","age").write.format("json").save("Output",SaveMode.ErrorIfExist)


<console>:35: error: overloaded method value save with alternatives:
  ()Unit <and>
  (path: String)Unit
 cannot be applied to (String, org.apache.spark.sql.SaveMode)
              df.select("name", "age").write.format("json").save("Output",SaveMode.ErrorIfExists

I checked the documentation, and it says SaveMode is deprecated. How can I fix this problem?

Any suggestions?

1 Answer:

Answer 0 (score: 3)

save takes at most a path; the save mode is set separately with the DataFrameWriter.mode method, which accepts either a string or a SaveMode value:

df.write.mode("error").save(...)                  // mode as a string

df.write.mode(SaveMode.ErrorIfExists).save(...)   // or as a SaveMode value
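Applied to the asker's snippet, a minimal sketch looks like this (the input path and "Output" directory are the asker's examples; note SaveMode lives in org.apache.spark.sql and must be imported):

```scala
import org.apache.spark.sql.SaveMode

val df = sqlContext.read.format("json").load("/user/root/spark/data/people.json")

// Set the mode on the DataFrameWriter, then call save with only the path.
df.select("name", "age")
  .write
  .format("json")
  .mode(SaveMode.ErrorIfExists)   // equivalently: .mode("error")
  .save("Output")
```

With ErrorIfExists (the default), the write fails if "Output" already exists; the other modes are Overwrite, Append, and Ignore.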