How does a SimpleRNN update its weights?

Time: 2019-12-03 09:35:46

Tags: python keras deep-learning recurrent-neural-network

I am discovering deep learning and RNNs.

Can anyone explain to me how Keras updates the weights of a SimpleRNN in this example:

Model: Y[t] = Wx·X[t] + Wh·Y[t-1]

The initial weights for the input and the hidden state are both 1

No activation function

Loss function: mean squared error

One epoch

Fit on X = [1,2,3,4], Y = 5

My second question: the loss comes out as 25 (the squared error of the last timestep t only), but shouldn't it be the sum of the squared errors over every timestep t?
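To see where the 25 comes from, here is a minimal sketch that unrolls the recurrence by hand, assuming we look at the SimpleRNN alone (ignoring the trailing Dense layer) and that, with the default `return_sequences=False`, only the last timestep's output enters the loss:

```python
# Manually unroll h[t] = w_x * x[t] + w_h * h[t-1]
# with both weights initialized to 1 and no activation
# (hypothetical trace of the SimpleRNN in the question,
#  ignoring the Dense layer on top).
w_x, w_h = 1.0, 1.0
x = [1, 2, 3, 4]
h = 0.0                  # initial hidden state
for x_t in x:
    h = w_x * x_t + w_h * h   # hidden states: 1, 3, 6, 10
print(h)                 # 10.0 (last timestep only)
loss = (h - 5) ** 2      # squared error against y = 5
print(loss)              # 25.0
```

This would explain the 25: the network produces a single output (the last hidden state, 10), so the squared error (10 − 5)² = 25 is computed on that one value, not summed over every timestep.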

Code


# Import relevant classes/functions
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers import Input, Concatenate, Dense, SimpleRNN

import numpy as np
import pandas as pd
from keras import optimizers
import keras

model = Sequential()
model.add(SimpleRNN(units=1, activation=None, input_shape=(None, 1),
                    use_bias=False,
                    kernel_initializer=keras.initializers.Ones(),
                    recurrent_initializer=keras.initializers.Ones()))
model.add(Dense(1, activation='sigmoid'))

sgd = optimizers.SGD(lr=1)
model.compile(loss='mean_squared_error', optimizer=sgd, metrics=['accuracy'])

print("weights :")
print(model.get_weights())

x_train = np.array([[1, 2, 3, 4]])
y_train = [5]

model.fit(x_train.reshape(x_train.shape[0], x_train.shape[1], 1),
          y_train, epochs=1, batch_size=1, verbose=True)

print(model.get_weights())
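As for the first question, a SimpleRNN's weights are updated by backpropagation through time (BPTT). The sketch below traces the gradients by hand under the same simplifying assumptions as above (SimpleRNN alone, no Dense layer, loss = squared error on the last timestep only); it is an illustration of the mechanism, not a reproduction of what Keras computes for the full two-layer model:

```python
# Hypothetical BPTT sketch for the single-unit SimpleRNN,
# ignoring the Dense layer and taking the loss as the squared
# error of the LAST hidden state against y = 5.
w_x, w_h = 1.0, 1.0
x = [1.0, 2.0, 3.0, 4.0]
y = 5.0

# Forward pass, keeping every hidden state for the backward pass.
hs = [0.0]
for x_t in x:
    hs.append(w_x * x_t + w_h * hs[-1])   # 1, 3, 6, 10

# Backward pass: the recurrences
#   dh[t]/dw_x = x[t]   + w_h * dh[t-1]/dw_x
#   dh[t]/dw_h = h[t-1] + w_h * dh[t-1]/dw_h
dh_dwx, dh_dwh = 0.0, 0.0
for x_t, h_prev in zip(x, hs[:-1]):
    dh_dwx = x_t + w_h * dh_dwx
    dh_dwh = h_prev + w_h * dh_dwh

dL_dh = 2.0 * (hs[-1] - y)        # derivative of (h - y)^2
grad_wx = dL_dh * dh_dwx          # 2 * (10 - 5) * 10 = 100
grad_wh = dL_dh * dh_dwh          # 2 * (10 - 5) * 10 = 100

lr = 1.0
print(w_x - lr * grad_wx, w_h - lr * grad_wh)  # SGD step: -99.0 -99.0
```

With a learning rate of 1 the update is enormous (both weights jump from 1 to −99), which hints at why such a large `lr` is a poor choice for experimenting with this model.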

0 Answers:

There are no answers yet.