This script defines a dummy model using the functional API:

from keras.layers import Input, Dense
from keras.models import Model
import keras
inputs = Input(shape=(100,), name='A_input')
x = Dense(20, activation='relu', name='B_dense')(inputs)
shared_l = Dense(20, activation='relu', name='C_dense_shared')
x = keras.layers.concatenate([shared_l(x), shared_l(x)], name='D_concat')
model = Model(inputs=inputs, outputs=x)
model.summary()

which produces the following output:
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
A_input (InputLayer) (None, 100) 0
____________________________________________________________________________________________________
B_dense (Dense) (None, 20) 2020 A_input[0][0]
____________________________________________________________________________________________________
C_dense_shared (Dense) (None, 20) 420 B_dense[0][0]
B_dense[0][0]
____________________________________________________________________________________________________
D_concat (Concatenate) (None, 40) 0 C_dense_shared[0][0]
C_dense_shared[1][0]
====================================================================================================
My question concerns the contents of the Connected to column. I understand that a layer can have multiple nodes. In this case, C_dense_shared has two nodes, and D_concat is connected to both of them (C_dense_shared[0][0] and C_dense_shared[1][0]). So the first index (node_index) is clear to me. But what does the second index mean? From the source code I gather that it is the tensor_index:

layer_name[node_index][tensor_index]

But what does tensor_index mean? Under what circumstances can its value be different from 0?
Answer 0 (score: 3)

I think the docstring of the Node class is quite clear:
tensor_indices: a list of integers,
the same length as `inbound_layers`.
`tensor_indices[i]` is the index of `input_tensors[i]` within the
output of the inbound layer
(necessary since each inbound layer might
have multiple tensor outputs, with each one being
independently manipulable).
tensor_index will be non-zero if a layer has multiple output tensors. This is different from the case of multiple "data flows" (e.g. layer sharing), where a layer has multiple outbound nodes. For example, an LSTM layer will return 3 tensors if given return_sequences=True and return_state=True: the output sequence plus the final hidden and cell states.
Another example: a feature transformation can be implemented as a Lambda layer:

from keras.layers import Input, Lambda, Concatenate, Dense
from keras.models import Model
from keras import backend as K

def generate_powers(x):
    return [x, K.sqrt(x), K.square(x)]

model_input = Input(shape=(10,))
powers = Lambda(generate_powers)(model_input)
x = Concatenate()(powers)
x = Dense(10, activation='relu')(x)
x = Dense(1, activation='sigmoid')(x)
model = Model(model_input, x)
From model.summary(), you can see that concatenate_5 is connected to lambda_7[0][0], lambda_7[0][1] and lambda_7[0][2]:
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
input_7 (InputLayer) (None, 10) 0
____________________________________________________________________________________________________
lambda_7 (Lambda) [(None, 10), (None, 1 0 input_7[0][0]
____________________________________________________________________________________________________
concatenate_5 (Concatenate) (None, 30) 0 lambda_7[0][0]
lambda_7[0][1]
lambda_7[0][2]
____________________________________________________________________________________________________
dense_8 (Dense) (None, 10) 310 concatenate_5[0][0]
____________________________________________________________________________________________________
dense_9 (Dense) (None, 1) 11 dense_8[0][0]
====================================================================================================
Total params: 321
Trainable params: 321
Non-trainable params: 0
____________________________________________________________________________________________________
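To make the bookkeeping concrete, here is a framework-free sketch of how the two indices play together. The Layer/Node names loosely mirror the Keras internals but this is an illustration of how the layer_name[node_index][tensor_index] references in the Connected to column resolve, not the actual Keras implementation.

```python
# Minimal sketch: node_index counts calls to a layer, tensor_index
# counts output tensors within a single call. (Hypothetical classes,
# not the real Keras Layer/Node.)

class Layer:
    def __init__(self, name, n_outputs=1):
        self.name = name
        self.n_outputs = n_outputs  # tensors produced per call
        self.nodes = []             # one entry appended per call

    def __call__(self):
        node_index = len(self.nodes)
        # each call (node) yields n_outputs tensors, indexed 0..n-1
        outputs = [(self.name, node_index, t) for t in range(self.n_outputs)]
        self.nodes.append(outputs)
        return outputs

def connected_to(tensor):
    """Format a tensor reference the way model.summary() does."""
    name, node_index, tensor_index = tensor
    return f"{name}[{node_index}][{tensor_index}]"

# A shared layer called twice -> two nodes, tensor_index stays 0:
shared = Layer("C_dense_shared")
first_call, second_call = shared(), shared()
print(connected_to(first_call[0]))   # C_dense_shared[0][0]
print(connected_to(second_call[0]))  # C_dense_shared[1][0]

# A layer with three output tensors -> one node, tensor_index varies:
lam = Layer("lambda_7", n_outputs=3)
outs = lam()
print([connected_to(t) for t in outs])
# ['lambda_7[0][0]', 'lambda_7[0][1]', 'lambda_7[0][2]']
```

Layer sharing moves the first index; multiple output tensors move the second.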