R neuralnet package running out of memory

Asked: 2016-01-13 20:12:34

Tags: r amazon-web-services memory machine-learning neural-network

I am running a classification task on 300k x 24 inputs with corresponding 300k x 25 outputs. I get a "Killed" error message. Following this question, I found that an OutOfMemory condition was being raised. I am running the simulation on an Amazon c4.large instance. I hoped that removing environment variables (such as labels, vals) would solve the problem, but it did not. Any ideas on how to get around the problem? The code I am running can be found below:

numLabels <- 25
numInput <- 24
newThreshold <- 10000
# 25th col of vals represents the data quality 
# Will not be used in training

randperm <- sample(length(vals$V1))

train <- cbind(vals, labels)
train <- train[randperm,]
rm(list=c('labels', 'vals'))

# Build the formula programmatically instead of spelling out all 49 terms.
# (The original passed env = train to as.formula, but env expects an
# environment, not a data frame; the data are supplied via data = train below.)
f <- as.formula(paste(
  paste0("X", 1:numLabels, collapse = " + "),
  "~",
  paste0("V", 1:numInput, collapse = " + ")
))
nn <- neuralnet(formula = f, threshold = newThreshold, data=train, hidden = c(100), linear.output=FALSE, err.fct='ce', act.fct='logistic', lifesign = 'full', lifesign.step = 100, stepmax=10000, rep = 2)

prediction <- compute(nn, train[, 1:numInput])$net.result > 0.5
# Compare against the label columns. Note the original index,
# numInput+1:ncol(train), parsed as numInput + (1:ncol(train)) and
# indexed past the last column of train.
print(mean(prediction == train[, (ncol(train) - numLabels + 1):ncol(train)]))
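One workaround worth trying before resizing the instance (not from the original post): train on a random subsample of rows and free the large originals before calling neuralnet, since cbind and the training loop each hold full copies. A minimal sketch with simulated stand-in data, assuming the same V*/X* column layout as the poster's vals/labels:

```r
# Sketch: cut peak memory by subsampling rows and freeing originals early.
# The data below are simulated stand-ins for the poster's `vals`/`labels`.
set.seed(1)
n <- 1000; numInput <- 24; numLabels <- 25

vals   <- as.data.frame(matrix(runif(n * (numInput + 1)), ncol = numInput + 1))
labels <- as.data.frame(matrix(rbinom(n * numLabels, 1, 0.5), ncol = numLabels))
names(labels) <- paste0("X", 1:numLabels)

train <- cbind(vals, labels)
rm(vals, labels); gc()                      # release the originals before training

sub <- train[sample(nrow(train), 200), ]    # train neuralnet on `sub` instead of `train`
stopifnot(nrow(sub) == 200, ncol(sub) == numInput + 1 + numLabels)
```

Training on a subsample will change the fitted model, of course, but it quickly tells you whether the memory ceiling (and not the code) is the problem.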

1 Answer:

Answer 0 (score: 0)

Have you tried increasing the memory limit?

memory.limit()

Take a look at this blog post on the subject:

http://www.r-bloggers.com/memory-limit-management-in-r/
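A caveat on the answer above: memory.limit() only applies to R on Windows; on a Linux box such as the poster's c4.large it has no effect. Cross-platform ways to inspect what R is actually holding, as a quick sketch:

```r
# Cross-platform memory inspection in base R (no memory.limit needed).
gc()                                   # report usage and trigger garbage collection
x <- matrix(0, 1000, 1000)             # 1e6 doubles, about 8 MB
print(object.size(x), units = "MB")    # size of a single object
stopifnot(as.numeric(object.size(x)) >= 8e6)
```

On Linux the practical levers are freeing objects with rm() plus gc(), shrinking the working set, or moving to an instance with more RAM.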