I am using IntelliJ and trying to fetch data from a MySQL DB and then write it to a Hive table. However, I am not able to register the RDD as a temporary table. The error is "Cannot resolve symbol registerTempTable".
I know this is caused by some missing import, but I cannot figure out which one.
I have been stuck on this for a long time and have tried every option/answer available on Stack Overflow.
Below is my code:
import java.sql.DriverManager
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD
import org.apache.spark.sql.hive.HiveContext

object JdbcRddExample {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:mysql://localhost:3306/retail_db"
    val username = "retail_dba"
    val password = "cloudera"

    Class.forName("com.mysql.jdbc.Driver").newInstance

    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    val myRDD = new JdbcRDD(sc, () => DriverManager.getConnection(url, username, password),
      "select department_id,department_name from departments limit ?,?",
      0, 999999999, 1, r => r.getString("department_id") + ", " + r.getString("department_name"))

    myRDD.registerTempTable("My_Table") // error: Not able to resolve registerTempTable

    sqlContext.sql("use my_db")
    sqlContext.sql("Create table my_db.depts (department_id INT, department_name String)")
  }
}
My SBT (I believe I have included all the required artifacts):
name := "JdbcRddExample"
version := "0.1"
scalaVersion := "2.11.12"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.3.1"
// https://mvnrepository.com/artifact/org.apache.spark/spark-hive
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.3.1" % "provided"
// https://mvnrepository.com/artifact/org.apache.spark/spark-mllib
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.3.1"
// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.7.1"
libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.11.0"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.0"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.12"
Please point me to the exact import I am missing. Or is there another way? As I mentioned, I have already tried all the solutions I could find, and nothing has worked so far.
Answer 0 (score: 1)
To use Spark SQL you need a DataFrame rather than an RDD; an RDD simply does not have a registerTempTable method, which is why the symbol cannot be resolved.
You can fix this quickly by converting the RDD to a DataFrame (see, for example, How to convert rdd object to dataframe in spark). However, it is recommended to use Spark SQL's built-in JDBC data source to read from MySQL directly, as in the example here. Sample code:
val dfDepartments = sqlContext.read.format("jdbc")
.option("url", url)
.option("driver", "com.mysql.jdbc.Driver")
.option("dbtable", "(select department_id,department_name from departments) t")
.option("user", username)
.option("password", password).load()
dfDepartments.createOrReplaceTempView("My_Table")
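If you want to keep the question's JdbcRDD instead, the quick fix mentioned above is to turn it into a DataFrame with toDF before registering a view; and once you have a DataFrame (whether from toDF or from the JDBC reader above), you can write it into Hive directly instead of building the table by hand. A minimal sketch, assuming Spark 2.x with a Hive-enabled SparkSession and that myRDD and dfDepartments are the values defined earlier; the session settings and table name my_db.depts are taken from the question and are illustrative:

```scala
import org.apache.spark.sql.SparkSession

// In Spark 2.x one Hive-enabled session replaces the SQLContext/HiveContext pair
val spark = SparkSession.builder()
  .appName("JDBC RDD")
  .master("local[2]")
  .enableHiveSupport()
  .getOrCreate()
import spark.implicits._

// Quick fix for the original code: the RDD of "id, name" strings
// becomes a DataFrame, which does have createOrReplaceTempView
val myDF = myRDD
  .map(_.split(", "))                  // "2, Fitness" -> Array("2", "Fitness")
  .map(a => (a(0).toInt, a(1)))
  .toDF("department_id", "department_name")
myDF.createOrReplaceTempView("My_Table")

// Write the JDBC-sourced DataFrame straight into the Hive table,
// letting Spark create it instead of issuing a manual CREATE TABLE
dfDepartments.write.mode("overwrite").saveAsTable("my_db.depts")
```

Note that saveAsTable creates the table with the DataFrame's own schema, so the manual "Create table my_db.depts ..." statement from the question is no longer needed.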