I have been looking into ways to increase R's memory limit on a Mac, but have not found anything that helps. I have tried the solutions in both "Increasing memory limit in R for Mac" and "R on MacOS Error: vector memory exhausted (limit reached?)", with no luck.
I am running
fviz_nbclust(df, kmeans, method = "wss")
on a dataset of 1.79M rows and 2 columns. The error says "vector memory exhausted (limit reached?)". I have tried memory.limit(), but that does not work on Mac.
My sessionInfo is as follows:
R version 3.6.1 (2019-07-05)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Sierra 10.12.6
Matrix products: default
BLAS: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] factoextra_1.0.6 ggplot2_3.2.1
loaded via a namespace (and not attached):
[1] ggrepel_0.8.1 Rcpp_1.0.3 withr_2.1.2 assertthat_0.2.1
[5] crayon_1.3.4 dplyr_0.8.3 grid_3.6.1 R6_2.4.1
[9] lifecycle_0.1.0 gtable_0.3.0 magrittr_1.5 scales_1.1.0
[13] pillar_1.4.2 rlang_0.4.1 lazyeval_0.2.2 tools_3.6.1
[17] glue_1.3.1 purrr_0.3.3 munsell_0.5.0 compiler_3.6.1
[21] pkgconfig_2.0.3 colorspace_1.4-1 tidyselect_0.2.5 tibble_2.1.3
Can anyone help? Thanks!
Answer 0 (score: 0)
Rather than increasing the memory limit (the solution in the linked questions), you should re-examine your current code. Optimizing the code often makes up for hardware limitations.
There are several articles on this topic (1)(2). In particular, have a look at the data.table package (3), and/or the bigmemory (4) and ff (5) packages. The latter creates a virtual data.frame that can nonetheless be handled like any regular R object. Below is an example adapted from the latter article:
library(ff)
# Import the file
bigobj.ff <- read.csv.ffdf(file="bigfile.csv")
# Check the object
class(bigobj.ff)
## [1] "ffdf"
# bigobj.ff is a virtual dataframe, but you can still perform normal operations
sum(bigobj.ff[,3])
## [1] 66029
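As a minimal sketch of the data.table route mentioned above (the file name and column names here are made up for illustration; fread() is data.table's CSV reader and is generally much faster and lighter on memory than read.csv()):

```r
library(data.table)

# Small stand-in for the real 1.79M-row, 2-column CSV
tmp <- tempfile(fileext = ".csv")
fwrite(data.table(x = runif(1000), y = runif(1000)), tmp)

# fread() parses the file quickly and returns a data.table
dt <- fread(tmp)

# data.table adds/modifies columns by reference, avoiding full copies
dt[, x_scaled := scale(x)]
head(dt)
```

Working on a scaled copy by reference like this keeps peak memory use low, which may be enough to get fviz_nbclust() to run without hitting the vector-memory limit.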