How to change the regularization parameter of a Keras layer without rebuilding the model in R

Asked: 2018-11-11 18:10:27

Tags: r keras

I want to fine-tune the L2 parameter of the last Keras layer using a for loop. My goal is to build an extreme learning machine (ELM), so the first layer keeps its random weights frozen (trainable = F) and only the output layer is trained. Right now I am using the following code:

#possible values for L2
k = 2^(seq(-20, -1, 1))

#vectors to store the metrics
acc_vector  = vector('numeric', length(k))
loss_vector = vector('numeric', length(k))

for(i in seq_along(k)){

  #rebuild the full model from scratch with the current L2 value
  model0 = keras_model_sequential() %>%
    layer_dense(units = 500, activation = 'relu', input_shape = c(784),
                trainable = F, name = 'dense1') %>%
    layer_dense(units = 10, activation = 'softmax',
                kernel_regularizer = regularizer_l2(k[i]), name = 'dense2') %>%
    compile(loss = 'categorical_crossentropy', optimizer = optimizer_rmsprop(),
            metrics = c('accuracy'))

  model0 %>% fit(
    x_train, y_train,
    epochs = 5, batch_size = 512,
    validation_split = 0.2, verbose = 0)

  scores = model0 %>% evaluate(x_test, y_test)

  acc_vector[i]  = scores$acc
  loss_vector[i] = scores$loss

  #I don't know why, but without the next two lines my memory usage doubles
  rm(model0, scores)
  gc()
}
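As an aside on the memory issue: my understanding (unverified) is that every keras_model_sequential() call adds nodes to the same backend TensorFlow graph, so the graph keeps growing across iterations. If that is right, resetting the backend at the end of each iteration should be an alternative to rm()/gc():

#assumption on my part: this resets the whole backend graph,
#discarding every model built so far in the session
k_clear_session()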

So, here is my problem. With this approach (which at least runs fast), the weights start from a fresh random draw in every iteration, so the metrics are confounded with the initialization and the comparison across L2 values is meaningless. I tried other approaches, such as passing saved initial weights to the first layer via its weights argument, and that worked fine, except for the processing time... it increased a lot!
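Concretely, that attempt looked roughly like this (a sketch from memory; model_init and w1 are just the scaffolding I used to capture dense1's random weights once):

#build a throwaway model once, just to capture the random dense1 weights
model_init = keras_model_sequential() %>%
  layer_dense(units = 500, activation = 'relu', input_shape = c(784),
              trainable = F, name = 'dense1')
w1 = get_weights(get_layer(model_init, 'dense1'))

#then, inside the loop, every rebuilt model starts from the same weights
model0 = keras_model_sequential() %>%
  layer_dense(units = 500, activation = 'relu', input_shape = c(784),
              trainable = F, weights = w1, name = 'dense1') %>%
  layer_dense(units = 10, activation = 'softmax',
              kernel_regularizer = regularizer_l2(k[i]), name = 'dense2')

After that, I tried popping the last layer and adding a new one with the new L2 value, like this: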

model0 = keras_model_sequential() %>%
  layer_dense(units = 500, activation = 'relu', input_shape = c(784),
              trainable = F, name = 'dense1') %>%
  layer_dense(units = 10, activation = 'softmax',
              kernel_regularizer = regularizer_l2(0.01),  #any placeholder value
              name = 'dense2')

for(i in seq_along(k)){
  model0 %>% pop_layer() %>%
    layer_dense(units = 10, activation = 'softmax',
                kernel_regularizer = regularizer_l2(k[i]), name = 'dense2')
}

But it does not work: after this, the model is left with only the first layer. All I want is to change the L2 value and retrain just the last layer of the model. How can I do that simply?
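For what it is worth, the only workaround I can think of is to precompute the frozen layer's activations once, so that each k[i] only trains a tiny one-layer model (a sketch, not something I have benchmarked; feature_model, h_train, h_test and top_model are names I am making up here):

#compute the frozen dense1 activations once, outside the loop
feature_model = keras_model_sequential() %>%
  layer_dense(units = 500, activation = 'relu', input_shape = c(784),
              trainable = F, name = 'dense1')
h_train = predict(feature_model, x_train)
h_test  = predict(feature_model, x_test)

#inside the loop, only the small output model is rebuilt and trained
top_model = keras_model_sequential() %>%
  layer_dense(units = 10, activation = 'softmax', input_shape = c(500),
              kernel_regularizer = regularizer_l2(k[i]), name = 'dense2') %>%
  compile(loss = 'categorical_crossentropy', optimizer = optimizer_rmsprop(),
          metrics = c('accuracy'))
top_model %>% fit(h_train, y_train, epochs = 5, batch_size = 512,
                  validation_split = 0.2, verbose = 0)

But that still rebuilds a (small) model per iteration, which is what I was hoping to avoid, hence the question.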

0 Answers:

There are no answers yet.