NbClust - Error: cannot allocate vector of size x

Time: 2014-01-27 18:49:27

Tags: r cluster-analysis k-means

I am running k-means clustering in R and would like to use NbClust to help determine the optimal number of clusters. My data set df has 636,688 rows and 7 columns.

When I run NbClust(df, min.nc = 2, max.nc = 3, method = "kmeans"), I get:

Error: cannot allocate vector of size 1510.1 Gb
In addition: Warning messages:
1: In dist(jeu, method = "euclidean") :
  Reached total allocation of 32767Mb: see help(memory.size)
2: In dist(jeu, method = "euclidean") :
  Reached total allocation of 32767Mb: see help(memory.size)
3: In dist(jeu, method = "euclidean") :
  Reached total allocation of 32767Mb: see help(memory.size)
4: In dist(jeu, method = "euclidean") :
  Reached total allocation of 32767Mb: see help(memory.size)
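As far as I can tell, the number in the error matches the lower triangle of a full double-precision distance matrix for 636,688 points, which dist() tries to build. A back-of-the-envelope check (my own arithmetic, not output from the session above):

# Size of the pairwise distances dist() would need to store:
# n*(n-1)/2 doubles at 8 bytes each, expressed in 1024^3-byte units.
n <- 636688
n * (n - 1) / 2 * 8 / 1024^3   # roughly 1510, matching the 1510.1 Gb in the error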

Here is my sessionInfo:

R version 3.0.2 (2013-09-25)
Platform: x86_64-w64-mingw32/x64 (64-bit)

locale:
[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252    LC_MONETARY=English_United Kingdom.1252
[4] LC_NUMERIC=C                            LC_TIME=English_United Kingdom.1252    

attached base packages:
[1] stats4    grid      stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] clusterSim_0.43-3 fpc_2.1-6         flexmix_2.3-11    mclust_4.2        cluster_1.14.4    MASS_7.3-29      
 [7] flexclust_1.3-4   modeltools_0.2-21 lattice_0.20-23   NbClust_1.4       rattle_2.6.26    

loaded via a namespace (and not attached):
[1] ade4_1.6-2     class_7.3-9    e1071_1.6-2    nnet_7.3-7     parallel_3.0.2 R2HTML_2.2.1   rgl_0.93.996   tools_3.0.2 

Is there a way around this size limit? I know I could take a sample of the data and run NbClust on that, but I have found the sample needs to be about 1% of the full data set before NbClust will even run - a sample I believe is too small to be useful, even if I repeat it over many iterations (see the sketch below). Even on small samples, NbClust is slow. I have also looked at cluster.stats in the fpc package, but that requires a distance matrix, which is likewise impossible for a data set of this size.
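For reference, the sampling workaround I mention looks roughly like this (the sample size and number of repetitions here are illustrative, not what I finally settled on):

# Repeatedly sample ~1% of the rows, run NbClust on each sample,
# and record the number of clusters recommended by the majority of indices.
library(NbClust)
set.seed(1)
best_k <- replicate(10, {
  idx <- sample(nrow(df), size = 5000)
  res <- NbClust(df[idx, ], min.nc = 2, max.nc = 3, method = "kmeans")
  # Best.nc row 1 holds each index's recommended number of clusters
  as.integer(names(which.max(table(res$Best.nc[1, ]))))
})
table(best_k)   # how often each k wins across the repeated samples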

0 Answers