Replacing '\\' or '\\\\' in a Spark data frame via sparklyr fails

Date: 2018-02-05 09:57:06

Tags: r apache-spark backslash sparklyr

I am trying to replace backslashes in a Spark data frame. I wrote a function that works fine on an R data frame, but when I plug it into spark_apply it has no effect:

rm(back_slash_replace_func)

back_slash_replace_func <- function(x) {
  cbind.data.frame(
    lapply(x, function(x) {
      if (class(x) == "character") {
        gsub(pattern = "\\", replacement = "/", x = x, fixed = TRUE)
      } else {
        x
      }
    }),
    stringsAsFactors = FALSE
  )
}

## do in R

x <- data.frame(x = rep('\\', 10), stringsAsFactors = FALSE)

back_slash_replace_func(x)
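For reference, the expected local result (output reconstructed as comments from the function's logic; every backslash becomes a forward slash):

head(back_slash_replace_func(x), 1)
#   x
# 1 /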

## do in spark

library(sparklyr)
library(dplyr)

r_spark_connection <- spark_connect(master = "local")

xsp <- copy_to(r_spark_connection, x, overwrite = TRUE)

start <- Sys.time()

spark_apply(
  x = xsp,
  f = back_slash_replace_func,
  memory = FALSE
)

Sys.time() - start

It does not do the job; there is no error and no warning. What could be going on here?

1 Answer:

Answer 0 (score: 2)

The first thing you should notice is that copy_to does not preserve your data correctly. While x is:

x %>% head(1)
#    x
# 1 \\

xsp is:

xsp %>% head(1)
# # Source:   lazy query [?? x 1]
# # Database: spark_connection
#   x    
#   <chr>
# 1 "\"" 

This is because when you use copy_to, sparklyr dumps the data to a flat file, which garbles the backslashes on the way in. That is why it cannot work even locally:

xsp %>% collect %>% back_slash_replace_func %>% head(1)
#   x
# 1 "

If instead you create the data frame directly:

df <- spark_session(r_spark_connection) %>%
  invoke("sql", "SELECT '\\\\' AS x FROM range(10)") %>%
  sdf_register()

df %>% collect %>% back_slash_replace_func %>% head(1)
#   x
# 1 /

this particular problem does not arise.
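As a side note, the same query can be run more concisely; a minimal sketch, assuming your installed sparklyr version provides sdf_sql() (not part of the original answer):

## Hypothetical shortcut, assuming sdf_sql() exists in your sparklyr version
df <- sdf_sql(r_spark_connection, "SELECT '\\\\' AS x FROM range(10)")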

The other problem here is that spark_apply actually converts strings to factors (according to Kevin's comment, this is tracked as sparklyr issue 1295). So instead of:

function(x) {
  if (is.character(x)) {
    gsub(pattern = "\\", replacement = "/", x = x, fixed = TRUE)
  } else {
    x
  }
}

you would rather need:

function(x) {
  if (is.factor(x)) {
    gsub(pattern = "\\", replacement = "/", x = as.character(x), fixed = TRUE)
  } else {
    x
  }
}
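Plugged back into the original wrapper, that check gives a version that survives the factor conversion; a minimal sketch, assuming the df built with the SQL query above (the name factor_safe_replace_func is made up here for illustration):

factor_safe_replace_func <- function(x) {
  cbind.data.frame(
    lapply(x, function(col) {
      ## spark_apply hands character columns over as factors (sparklyr issue 1295),
      ## so test for factor and convert back before substituting
      if (is.factor(col)) {
        gsub(pattern = "\\", replacement = "/", x = as.character(col), fixed = TRUE)
      } else {
        col
      }
    }),
    stringsAsFactors = FALSE
  )
}

spark_apply(x = df, f = factor_safe_replace_func) %>% head(1)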

but in practice it is simpler to just use translate:

df %>% mutate(x = translate(x, "\\\\", "/")) %>% head(1)
# # Source:   lazy query [?? x 1]
# # Database: spark_connection
#   x    
#   <chr>
# 1 /
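Note that translate substitutes character for character (here every backslash in x becomes a forward slash) and runs natively in Spark SQL, so it avoids shipping each partition through an R process the way spark_apply does.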