How to use a connection pool with ForeachWriter in Spark Structured Streaming

Time: 2019-04-16 10:46:28

Tags: apache-spark

I want to use a connection pool in Spark Structured Streaming, but I don't want every writeStream to build a new pool of its own. How should I change this?

Right now I want to put the settings in application.conf, but the object is loaded before the main method has picked up the location of application.conf. The code looks like this:

import java.sql.{Connection, Statement}

import com.zaxxer.hikari.{HikariConfig, HikariDataSource}
import org.apache.spark.sql.{ForeachWriter, Row}

class JdbcSink extends ForeachWriter[Row] with Serializable with Settings {
  self: SinkStatement =>

  import JdbcSink.dsPool

  var connection: Connection = _
  // the SQL statement handle, created once per partition in open()
  var statement: Statement = _

  // called once per partition at the start of each epoch;
  // borrow a connection from the shared pool
  def open(partitionId: Long, version: Long): Boolean = {
    connection = dsPool.getConnection()
    statement = connection.createStatement()
    true
  }

  // called for every row in the partition
  def process(value: Row): Unit = {
    statement.executeUpdate(this.make(value))
  }

  // called when the partition is finished (or on failure);
  // connection.close() returns a pooled connection to the pool,
  // whereas evictConnection() would destroy it and defeat the pooling
  def close(errorOrNull: Throwable): Unit = {
    connection.close()
  }

}
object JdbcSink extends Settings {

  val config = new HikariConfig
  config.setDriverClassName(this.sinkDbDriver)
  config.setJdbcUrl(this.sinkDbUrl)
  config.setUsername(this.sinkDbUser)
  config.setPassword(this.sinkDbPwd)
  config.addDataSourceProperty("cachePrepStmts", "true")
  config.addDataSourceProperty("prepStmtCacheSize", "250")
  config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048")
  // pass the config built above (the original `this.HConfig` is undefined)
  val dsPool = new HikariDataSource(config)
}
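
For context, the sink is attached to a query roughly like this (a minimal sketch: `df` stands for an assumed streaming DataFrame, and SinkStatement is the asker's own trait required by the self-type, assumed here to be concrete and to provide make(row)):

val query = df.writeStream
  .foreach(new JdbcSink with SinkStatement)
  .start()
query.awaitTermination()

Spark serializes the writer to the executors and calls open/process/close once per partition per epoch, which is why the pool lives in the companion object rather than inside the writer instance.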

How can I make these settings flexible? Loading them "through Settings" as shown below throws a "Could not initialize class JdbcSink" exception:

      val config = new HikariConfig
      config.setDriverClassName(this.sinkDbDriver)
      config.setJdbcUrl(this.sinkDbUrl)
      config.setUsername(this.sinkDbUser)
      config.setPassword(this.sinkDbPwd)
      config.addDataSourceProperty("cachePrepStmts", "true")
      config.addDataSourceProperty("prepStmtCacheSize", "250")
      config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048")
      val dsPool = new HikariDataSource(config)
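
One pattern that usually avoids this initializer error is to make the pool a lazy val, so the object body no longer runs at class-load time on the executor. This is a sketch under the assumption that the Settings values can be resolved by the time open() first touches dsPool, not a confirmed fix for the asker's setup:

object JdbcSink extends Settings {

  // lazy: the pool is built on the first call to dsPool.getConnection(),
  // not when the class is loaded, so application.conf has already been
  // read by then (assumption: Settings also resolves its values lazily)
  lazy val dsPool: HikariDataSource = {
    val config = new HikariConfig
    config.setDriverClassName(this.sinkDbDriver)
    config.setJdbcUrl(this.sinkDbUrl)
    config.setUsername(this.sinkDbUser)
    config.setPassword(this.sinkDbPwd)
    config.addDataSourceProperty("cachePrepStmts", "true")
    config.addDataSourceProperty("prepStmtCacheSize", "250")
    config.addDataSourceProperty("prepStmtCacheSqlLimit", "2048")
    new HikariDataSource(config)
  }
}

Each executor JVM initializes the object once, so a single pool is shared by all partitions on that executor, no matter how many writeStream queries use the sink.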

0 Answers:

No answers yet.