My object has different methods such as SGD, Adam, and Adamax, which I can call like this:
optim.SGD(parameters, lr, momentum=0.9)
optim.Adam(parameters, lr, momentum=0.9)
optim.Adamax(parameters, lr, momentum=0.9)
How can I call all of them in a loop? I have the following algorithm:
models = [..., ..., ...]
lrs = [..., ..., ...]
criterions = [..., ..., ...]
for model in models:
    for criterion in criterions:
        for lr in lrs:
            optimizer = optim.SGD(model.params(), lr=lr, momentum=0.9)
            train(model=model,
                  criterion=criterion,
                  optimizer=optimizer,
                  lr=lr)
How can I call train() with all the optimizer algorithms I mentioned (optim.SGD, optim.Adam, optim.Adamax)?
Answer 0 (score: 0)
You can use getattr to look up the train attribute of each optimizer and then call it:
...
getattr(optimizer, 'train')(model=model,
                            criterion=criterion,
                            optimizer=optimizer,
                            lr=lr)
There is also operator.methodcaller, which does exactly the same thing with slightly different syntax:
from operator import methodcaller
...
methodcaller('train', model=model,
             criterion=criterion,
             optimizer=optimizer,
             lr=lr)(optimizer)
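To see that both spellings really are the same call, here is a self-contained sketch with a toy Trainer class (the class and its train method are hypothetical stand-ins, not part of the question's code):

```python
from operator import methodcaller

class Trainer:
    """Toy stand-in for any object exposing a train() method."""
    def train(self, lr):
        return f"trained with lr={lr}"

t = Trainer()

# Ordinary attribute access, getattr, and methodcaller all
# resolve to the same bound method and produce the same result:
direct = t.train(lr=0.1)
via_getattr = getattr(t, 'train')(lr=0.1)
via_methodcaller = methodcaller('train', lr=0.1)(t)

print(direct == via_getattr == via_methodcaller)  # True
```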
Answer 1 (score: -1)
Enumerate the optimizer functions directly:
models = [..., ..., ...]
lrs = [..., ..., ...]
criterions = [..., ..., ...]
optim_funcs = [optim.SGD, optim.Adam, optim.Adamax]
for model in models:
    for criterion in criterions:
        for lr in lrs:
            for func in optim_funcs:
                optimizer = func(model.params(), lr=lr, momentum=0.9)
                train(model=model,
                      criterion=criterion,
                      optimizer=optimizer,
                      lr=lr)
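A small variation on the nested loops above: itertools.product flattens the four loops into a single one, which can make the sweep easier to read and to log. The values below are hypothetical placeholders, not the question's actual models or losses; note also that in PyTorch, Adam and Adamax take betas rather than a momentum keyword (only SGD accepts momentum), so the keyword arguments may need to vary per optimizer.

```python
from itertools import product

# Hypothetical placeholders standing in for real models,
# loss functions, learning rates, and optimizer constructors.
models = ['model_a', 'model_b']
criterions = ['mse', 'cross_entropy']
lrs = [0.1, 0.01]
optim_funcs = ['SGD', 'Adam', 'Adamax']

# product() yields every (model, criterion, lr, func) combination,
# replacing the four nested for-loops with one flat loop.
runs = list(product(models, criterions, lrs, optim_funcs))
print(len(runs))  # 2 models * 2 criterions * 2 lrs * 3 optimizers = 24

for model, criterion, lr, func in runs:
    # here one would build the optimizer and call train(), e.g.
    # optimizer = func(model.parameters(), lr=lr)
    pass
```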