I am using Keras (with the TensorFlow backend) and am trying to get layer outputs (the actual activations) on my training set at train time, i.e. while training with the `fit` function.
Is there any way to get the activations for the last batch used in training as part of an `on_batch_end` callback? Or any other way to access the layer outputs?
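To make the question concrete, this is roughly the callback skeleton I have in mind (just a sketch; `ActivationLogger` is a made-up name, and as far as I can tell `on_batch_end` only receives the batch index and a `logs` dict of loss/metric values, not the batch data or activations):

    from keras.callbacks import Callback

    class ActivationLogger(Callback):
        # Sketch of the hook I would like to use. `on_batch_end` is only
        # given the batch index and a `logs` dict (loss/metrics); the batch
        # inputs and the forward-pass activations are not exposed here,
        # which is exactly the problem.
        def __init__(self, layer_name):
            super(ActivationLogger, self).__init__()
            self.layer_name = layer_name
            self.batch_activations = []

        def on_batch_end(self, batch, logs=None):
            # Ideally something like the line below would exist, reusing the
            # forward pass that training just performed, but no such
            # attribute is provided by Keras:
            # acts = self.model.get_last_batch_activations(self.layer_name)
            pass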
I found the code below, but it runs another forward pass on new data. I am trying to take advantage of the fact that my network has already done a forward pass on the batch as part of training, and simply pull out the current activations. Is that possible?
    from keras import backend as K

    def get_activations(model, model_inputs, print_shape_only=False, layer_name=None):
        # Run a forward pass and return the outputs of every layer
        # (or only of `layer_name`, if given) for the supplied inputs.
        print('----- activations -----')
        activations = []
        inp = model.input

        model_multi_inputs_cond = True
        if not isinstance(inp, list):
            # only one input! let's wrap it in a list.
            inp = [inp]
            model_multi_inputs_cond = False

        # symbolic outputs of every layer (or just the named one)
        outputs = [layer.output for layer in model.layers
                   if layer.name == layer_name or layer_name is None]

        # one evaluation function per layer output
        funcs = [K.function(inp + [K.learning_phase()], [out]) for out in outputs]

        if model_multi_inputs_cond:
            list_inputs = []
            list_inputs.extend(model_inputs)
            list_inputs.append(0.)
        else:
            list_inputs = [model_inputs, 0.]

        # Learning phase 0 = test mode (dropout disabled, batch norm uses
        # its moving statistics).
        layer_outputs = [func(list_inputs)[0] for func in funcs]
        for layer_activations in layer_outputs:
            activations.append(layer_activations)
            if print_shape_only:
                print(layer_activations.shape)
            else:
                print(layer_activations)
        return activations
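For completeness, this is how such a function would be called, e.g. from inside a callback, by running a second forward pass on the batch (a minimal sketch; the toy model and `x_batch` are placeholders, not my actual setup):

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Toy model and data, purely for illustration.
    model = Sequential([
        Dense(32, activation='relu', input_shape=(16,)),
        Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy')

    x_batch = np.random.rand(8, 16).astype('float32')
    acts = get_activations(model, x_batch, print_shape_only=True)
    # acts[i] is the output of model.layers[i] for this batch, computed by a
    # second forward pass, which is exactly what I am trying to avoid.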