I am using the sparklyr library to read and write data from R. Reading data works as expected, but the function copy_to does not. I read a sas7bdat dataset and want to copy it to Spark. I am using Spark 2.3.0.
> sessionInfo()
R version 3.5.1 (2018-07-02)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)
attached base packages:
[1] stats graphics grDevices utils datasets
[6] methods base
other attached packages:
[1] DBI_1.0.0 magrittr_1.5
[3] dplyr_0.7.6 spark.sas7bdat_1.2
[5] sparklyr_0.8.4
The code used:
library(sparklyr)
library(spark.sas7bdat)
library(dplyr)
library(magrittr)
conf <- spark_config()
conf$`sparklyr.shell.driver-memory` <- "254G"
conf$spark.memory.fraction <- .8
sc <- spark_connect(master = "local", config = conf)
mysasfile <- system.file("extdata", "iris.sas7bdat", package = "spark.sas7bdat")
mysasfile <- "./base_12.sas7bdat"
x <- spark_read_sas(sc, path = mysasfile, table = "sas_example")
copy_to(sc, tab1)
And I get the following error:
Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 4 in stage 2.0 failed 1 times, most recent failure: Lost
task 4.0 in stage 2.0 (TID 6, localhost, executor driver):
java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at com.github.saurfang.sas.util.PrivateMethodCaller.apply(PrivateMethodExposer.scala:11)
    at com.github.saurfang.sas.mapred.SasRecordReader.readNext$lzycompute$1(SasRecordReader.scala:119)
    at com.github.saurfang.sas.mapred.SasRecordReader.readNext$1(SasRecordReader.scala:118)
    at com.github.saurfang.sas.mapred.SasRecordReader.next(SasRecordReader.scala:131)
    at