I am trying to use MATLAB's Neural Network Toolbox to simulate a difference equation. The equation has the form:
y[k+1] = y[k] + d_t * (-a*y[k] + x[k])
a and d_t are represented by 2 adjustable weights in the NN: a is the weight of the first layer, the two weights of the second layer both equal 1, the third layer's (non-delayed) weight is d_t, and the last weight is 1.
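Spelled out layer by layer (my reading of the intended structure; u1 and u2 denote the linear layer outputs), the network should compute:

```matlab
% Intended signal flow (sketch; all layers linear, no biases):
%   u1[k]  = 1 * y[k]                 % layer 1: output fed back through a unit delay
%   u2[k]  = (-a) * u1[k] + 1 * x[k]  % layer 2: -a*y[k] + x[k]
%   y[k+1] = d_t * u2[k] + 1 * y[k]   % layer 3: Euler step with delayed self-loop
```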
The code that builds the network is:
%% input
t = 1:100;
x = ones(1,length(t));
%% difference equation
% constants
d_t = 0.1;
a = 1;
% initialization
y0 = 0;
% output signal
y = zeros(length(t), 1);
y(1) = y0;
for ii = 1:(length(t)-1)
    y(ii+1) = y(ii) + d_t * (-a*y(ii) + x(ii));
end
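For a constant input the recursion also has a closed form, which is a handy sanity check for both the loop above and, later, the network output (a small sketch; y_closed is my own variable name):

```matlab
% Closed-form check for constant input x = 1 and y(1) = 0:
%   y[k+1] = (1 - a*d_t)*y[k] + d_t  =>  y[k] = (1/a)*(1 - (1 - a*d_t)^(k-1))
k = 1:length(t);
y_closed = (1/a) * (1 - (1 - a*d_t).^(k-1));
max(abs(y_closed(:) - y(:)))   % should be ~0 (floating-point noise)
```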
% input
plot(t, x, 'b--')
hold on
% difference equation output
plot(t, y, 'r' )
%% NN
net = network;
% structure
net.numInputs = 1;
net.numLayers = 3;
% neurons
net.inputs{1}.size = 1;
net.layers{1}.size = 1;
net.layers{2}.size = 1;
net.layers{3}.size = 1;
%names
net.inputs{1}.name = 'x';
net.layers{1}.name = '1st';
net.layers{2}.name = '2nd';
net.layers{3}.name = '3rd';
net.outputs{3}.name = 'y';
% connections
net.biasConnect = [0; 0; 0]; % exclude biases
net.inputConnect = [0; 1; 0]; % input connected to 2nd layer
net.layerConnect = [0 0 1; ... % 3rd to 1st {1,3}
1 0 0; ... % 1st to 2nd
0 1 1]; % 2nd to 3rd; 3rd to 3rd
net.outputConnect= [0 0 1]; % 3rd layer connected to output
% set the weights
% input weight
net.IW{2,1} = 1;
net.inputWeights{2,1}.learn = 0;
% 3rd to 1st
net.LW{1,3} = 1; % value
net.layerWeights{1,3}.learn = 0; % do not change
% 1st to 2nd
net.LW{2,1} = -1; % TO BE TRAINED
net.layerWeights{2,1}.learn = 1; % train
% 2nd to 3rd
net.LW{3,2} = 0.1; % delta T (finite time difference)
net.layerWeights{3,2}.learn = 0; % do not change
% 3rd to 3rd
net.LW{3,3} = 1; % value
net.layerWeights{3,3}.learn = 0; % do not change
% add delay
net.layerWeights{1,3}.delays = 1;
net.layerWeights{3,3}.delays = 1;
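One caveat I am not sure about: changing a layerWeights{...}.delays property reconfigures the corresponding weight, which may re-dimension or reset the LW value that was already assigned. Setting the delays before the weight values might be safer:

```matlab
% Possible ordering pitfall (my assumption, not verified): assign the
% delays first, then the weight values, so the values are not reset
% when the weights are re-dimensioned.
net.layerWeights{1,3}.delays = 1;
net.layerWeights{3,3}.delays = 1;
net.LW{1,3} = 1;
net.LW{3,3} = 1;
```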
view(net)
hold on;
plot(net(x),'g--')
But the result looks like this:

It seems the delays are being ignored and the input signal is simply passed through the constant d_t = 0.1.

Any ideas why the previous output does not seem to reach the first and second layers?
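One possibility I considered (my assumption about the cause): net(x) with a plain 1-by-100 double matrix may be interpreted as 100 concurrent samples rather than a 100-step time sequence, in which case the delayed connections never see past values. The sequence form of the call (con2seq and cell2mat are Toolbox utilities) would be:

```matlab
% Sequence form of the simulation call:
xs = con2seq(x);            % 1x100 double -> 1x100 cell-array sequence
ys = net(xs);               % simulate step by step, honoring the delays
plot(cell2mat(ys), 'g--')
```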