I am reading some Keras code, but there is one line I can't understand. The code is as follows:
from keras import initializers
from keras.layers import Dense, Concatenate
from keras.layers import Input
m = 1
n = 1
N = 20
a = Input(shape=(m,))
b = Input(shape=(m,))
c = Input(shape=(1,))
d = Input(shape=(1,))
inputs = [a, b, d, c]
output_state = []
layers = []
for j in range(N):
    for i in range(n):
        layer = Dense(m, activation='tanh', trainable=True,
                      kernel_initializer=initializers.RandomNormal(0, 1),
                      bias_initializer='random_normal', name=str(i) + str(j))
        layers = layers + [layer]
for j in range(N):
    helper1 = Concatenate()([a, b])
    for i in range(n):
        temp = layers[i + j * n](helper1)
The line I don't understand is the last one. They define a neural network in which each layer has an input of size 2 * (m + 1). The part
(helper1)
looks to me like the input to one of the layers, which doesn't make sense, because helper1 doesn't have size 2 * (m + 1). So I'd like to ask what is happening in the last line of the code.
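For context on the syntax in that last line: in the Keras functional API, a layer object is itself callable, and calling it on a tensor returns the transformed tensor, so `layers[i + j * n](helper1)` first indexes a layer out of the list and then applies it to `helper1`. Below is a minimal pure-numpy sketch of this layer-as-callable pattern (it is not Keras; `ToyDense`, its weights, and the dummy `helper1` array are all invented here for illustration):

```python
import numpy as np

class ToyDense:
    """Toy stand-in for keras.layers.Dense: a layer object that is callable."""
    def __init__(self, units, in_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 1, size=(in_dim, units))  # kernel ~ N(0, 1)
        self.b = np.zeros(units)                         # bias

    def __call__(self, x):
        # Calling the layer on an input applies it and returns the output tensor.
        return np.tanh(x @ self.W + self.b)

m, n, N = 1, 1, 20
# Build N * n layers up front, mirroring the first loop in the question;
# layers[i + j * n] then picks the i-th layer of the j-th group.
layers = [ToyDense(m, in_dim=2 * m, seed=k) for k in range(N * n)]

# Dummy batch standing in for Concatenate()([a, b]): shape (batch, 2 * m).
helper1 = np.ones((1, 2 * m))
j, i = 0, 0
temp = layers[i + j * n](helper1)  # index a layer, then call it on helper1
print(temp.shape)  # (1, 1)
```

The point of the sketch is only the calling convention: `layers[i + j * n]` evaluates to a layer object, and the trailing `(helper1)` invokes that object on the concatenated tensor.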