I get a "Table not found" error when using:
SQLContext sqlCon=SQLContext.getOrCreate(saprkContext.sc());
The stack trace is as follows:
18/08/16 19:58:31 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
Exception in thread "main" org.apache.spark.sql.AnalysisException: Table not found: `schema`.`tableName`;
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:54)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:50)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:121)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:120)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:120)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:120)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:50)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:44)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:35)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:829)
at com.ktk.ccemi.MIProcess$1StaticDataLoader.loadData(MIProcess.java:211)
at com.ktk.ccemi.MIProcess$1StaticDataLoader.<init>(MIProcess.java:167)
at com.ktk.ccemi.MIProcess.createContext(MIProcess.java:234)
at com.ktk.ccemi.MIProcess$1.call(MIProcess.java:74)
at com.ktk.ccemi.MIProcess$1.call(MIProcess.java:1)
at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$10.apply(JavaStreamingContext.scala:776)
I am using Spark Streaming with checkpointing, and my Spark version is 1.6.
Answer 0 (score: 0)
To access Hive tables you need a HiveContext in Spark 1.x, or a SparkSession with Hive support enabled in Spark 2.x. Accessing Hive tables through the basic SQLContext is not supported.
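A minimal sketch of what that change looks like for Spark 1.6 in Java, assuming the spark-hive artifact is on the classpath; the class name, app name, and the table `schema`.`tableName` (taken from the stack trace) are placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class HiveTableExample {
    public static void main(String[] args) {
        // A plain SQLContext only sees its own in-memory temp tables;
        // a HiveContext can resolve tables registered in the Hive metastore.
        JavaSparkContext sparkContext =
                new JavaSparkContext(new SparkConf().setAppName("HiveTableExample"));
        HiveContext hiveContext = new HiveContext(sparkContext.sc());

        // `schema`.`tableName` is the placeholder table name from the stack trace.
        DataFrame df = hiveContext.sql("SELECT * FROM `schema`.`tableName`");
        df.show();
    }
}

In Spark 2.x the equivalent entry point is SparkSession.builder().enableHiveSupport().getOrCreate(), which replaces both SQLContext and HiveContext.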