How to implement learning rate decay or cyclical learning rates in a Mathematica neural network

Posted: 2020-01-08 14:11:07

Tags: machine-learning deep-learning neural-network wolfram-mathematica

I am training a neural network with Mathematica's NetTrain[] function. There is a way to set the learning rate (the LearningRate and LearningRateMultipliers options), but I would like the learning rate to decay over training, or to change depending on how the loss evolves.
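What I have in mind is something like the sketch below: attaching a schedule to the optimizer through the Method option's "LearningRateSchedule" suboption, which (as I read the NetTrain documentation) rescales the initial learning rate over the course of training. The exact argument convention of the schedule function is my assumption, so treat this only as a sketch:

```
(* Sketch only: assumes the optimizer's "LearningRateSchedule" suboption
   rescales the initial "LearningRate" by f[current, total]; check the
   NetTrain documentation for the exact argument convention. *)

(* cosine decay from 1 down to 0 over the whole run *)
cosineDecay = Function[{i, total}, 0.5 (1 + Cos[Pi i/total])];

(* triangular cyclical schedule oscillating between 0.1 and 1, period 10 *)
cyclical = Function[{i, total}, 0.1 + 0.9 (1 - Abs[Mod[i, 10]/5 - 1])];

NetTrain[preTrainedNet, trainData, All,
  ValidationSet -> valData,
  MaxTrainingRounds -> epochs, TargetDevice -> "GPU",
  Method -> {"ADAM", "LearningRate" -> lr,
    "LearningRateSchedule" -> cosineDecay},
  BatchSize -> 8]
```

If that works, the per-layer LearningRateMultipliers from my fixed-rate call below should still be usable for freezing layers, since they act as multipliers on top of the global (scheduled) rate.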

My current call for transfer learning with a fixed learning rate:

```
NetTrain[preTrainedNet, trainData, All,
  ValidationSet -> valData,
  MaxTrainingRounds -> epochs, TargetDevice -> "GPU",
  LearningRateMultipliers -> {"classifier" -> lr,
    {"base", 1, "conv_conv2d"} -> lr, {"base", 1, "conv_relu"} -> lr,
    _ -> 0},
  BatchSize -> 8,
  TrainingProgressCheckpointing -> {"Directory",
    "C:\\DataSets\\RZ-DL-Aug-Pre",
    "Interval" -> Quantity[10, "Rounds"]}];
```

How can this be implemented?


0 Answers:

No answers yet.