connectionProperties value lost after Spark load

Date: 2018-02-04 14:57:24

Tags: oracle scala apache-spark

First, I configure and call load on the table:

val s2gathdata = 
    spark.sqlContext
    .read
    .format("jdbc")
    .options(Map("url" ->"jdbc:oracle:thin:user/password@url:1521:orcl",
                 "dbtable" -> "table",
                 "connectionProperties" -> "oracle.jdbc.timezoneAsRegion=false"))
    .load()

This executes successfully. But afterwards, when I call s2gathdata.count(), it throws an exception because the configuration oracle.jdbc.timezoneAsRegion=false is missing.

What should I do?

1 answer:

Answer 0 (score: 1)

connectionProperties should be supplied as an argument to the jdbc call:

val properties = new java.util.Properties()
properties.put("oracle.jdbc.timezoneAsRegion", "false")

val s2gathdata =
    spark.read
    .jdbc("jdbc:oracle:thin:user/password@url:1521:orcl", "table", properties)

您还可以提供在连接字符串中编码的属性。
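As another option, Spark's JDBC data source forwards options it does not recognize to the driver as connection properties, so the timezone setting can also be set with a plain `.option(...)` call. A minimal sketch, assuming a `SparkSession` named `spark` and the same Oracle endpoint as above (behavior may vary slightly across Spark versions):

```scala
// Sketch: extra .option keys that Spark's JDBC source does not
// recognize are passed through to the Oracle driver as connection
// properties, which covers oracle.jdbc.timezoneAsRegion.
val s2gathdata = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:user/password@url:1521:orcl")
  .option("dbtable", "table")
  .option("oracle.jdbc.timezoneAsRegion", "false") // forwarded to the driver
  .load()
```

This keeps the whole configuration in one fluent chain instead of building a separate `java.util.Properties` object.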