Converting a Spark DataFrame to a sparklyr table "tbl_spark"

Asked: 2018-01-16 19:48:18

Tags: r apache-spark sparklyr

I am trying to convert a Spark DataFrame (org.apache.spark.sql.DataFrame) to a sparklyr table (tbl_spark). I tried sdf_register, but it fails with the error below.

Here, df is the Spark DataFrame:

sdf_register(df, name = "my_tbl")

The error is:

Error: org.apache.spark.sql.AnalysisException: Table not found: my_tbl; line 2 pos 17
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:306)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:315)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$9.applyOrElse(Analyzer.scala:310)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:57)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:56)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:54)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:281)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)

Am I missing something? Or is there a better way to convert it to a tbl_spark?

Thanks!

1 Answer:

Answer 0 (score: 1)

Use sdf_copy_to() or dplyr::copy_to(), e.g. my_tbl <- sdf_copy_to(sc, df, "my_tbl").
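
For completeness, a minimal sketch of that approach. It assumes a local Spark connection named sc; the built-in iris data frame stands in for the df from the question, and the table name "my_tbl" matches the question:

library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")
df <- iris  # stand-in for the data in the question

# Copy the data into Spark and register it under the name "my_tbl";
# the return value is a tbl_spark that dplyr verbs can query lazily.
my_tbl <- sdf_copy_to(sc, df, name = "my_tbl", overwrite = TRUE)

# Equivalent with dplyr:
# my_tbl <- dplyr::copy_to(sc, df, "my_tbl", overwrite = TRUE)

class(my_tbl)  # "tbl_spark" "tbl_sql" "tbl_lazy" "tbl"

spark_disconnect(sc)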