How to put data from a Spark RDD into a MySQL table

Date: 2018-08-21 06:57:18

Tags: mysql apache-spark apache-spark-sql

I have to move data from a Spark RDD into a MySQL table. Can anyone help me?

1 answer:

Answer 0: (score: 1)

A possible example, and just one style of doing it, but it should get you going:

import org.apache.spark.sql.SaveMode
import spark.implicits._  // needed for .toDF on the parallelized data

// Write out to MySQL via the DataFrame Writer API - this creates the table as well

val rdd = spark.sparkContext.parallelize(Seq(
    ("A2", "X", "done"),
    ("A2", "Y", "done"),
    ("C2", "Y", "done"),
    ("B2", "Y", "done"),
    ("Z2", "Z", "done")
  ))

val jdbcDF = rdd.toDF("Company", "Type", "Status")

// Saving data to a JDBC source - creates the table as well.
// Note: .mode(...) must come before .save(), and .save() actually triggers the write.

jdbcDF.write
   .format("jdbc")
   .option("url", "jdbc:mysql://db4free.net:3306/mySQLDB")
   .option("dbtable", "tabname")
   .option("user", "xxx")
   .option("password", "yyy")
   .mode(SaveMode.Append)
   .save()

You may also need to set the JDBC driver; I did not, because I was running this under Databricks, which provides it.
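If the MySQL driver is not picked up automatically in your environment, it can be set explicitly via the `driver` option. A minimal sketch, reusing the placeholder URL, table name, and credentials from the answer above; the class name `com.mysql.cj.jdbc.Driver` assumes MySQL Connector/J 8.x (older 5.x versions use `com.mysql.jdbc.Driver` instead):

```scala
// Same write as above, but with the JDBC driver class named explicitly.
// The connector jar must also be on the Spark classpath, e.g. via
// --jars or --packages when launching spark-submit / spark-shell.
jdbcDF.write
  .format("jdbc")
  .option("driver", "com.mysql.cj.jdbc.Driver") // Connector/J 8.x class name
  .option("url", "jdbc:mysql://db4free.net:3306/mySQLDB")
  .option("dbtable", "tabname")
  .option("user", "xxx")
  .option("password", "yyy")
  .mode(SaveMode.Append)
  .save()
```

This sketch requires a reachable MySQL instance with valid credentials, so it is not runnable as-is.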