Solution:
Generate the training set
traincol1 = linspace(0.1, 15, 40)';              % 40 input points in [0.1, 15]
eps = (0.2*rand(40,1)) - 0.1;                    % uniform noise in [-0.1, 0.1]
traincol2 = sin(traincol1)./traincol1 - eps;     % noisy samples of sin(x)/x
train = [traincol1 traincol2];
save('snn_a.txt','train','-ascii');              % '-ascii' so the .txt file is plain text
save('snn_a.mat','train');
Generate the test set
testcol1 = linspace(0.1, 15, 400)';              % 400 input points in [0.1, 15]
eps = (0.2*rand(400,1)) - 0.1;                   % uniform noise in [-0.1, 0.1]
testcol2 = sin(testcol1)./testcol1 - eps;        % noisy samples of sin(x)/x
test = [testcol1 testcol2];
save('snn_b.txt','test','-ascii');               % '-ascii' so the .txt file is plain text
save('snn_b.mat','test');
Train the neural network
function net = train_net(trainingset, hidden_neurons)
% Parameters:
%   trainingset    - two-column matrix: inputs x in column 1, targets y in column 2
%   hidden_neurons - number of neurons in the hidden layer
% Return value:
%   net - object representing the trained neural network
% Initialization:
%   hidden-layer activation function - tanh (tansig)
%   output-layer activation function - linear (purelin)
net = newff(trainingset(:, 1)', trainingset(:, 2)', hidden_neurons, ...
    {'tansig', 'purelin'}, 'trainlm');
rand('state', sum(100*clock));   % random number generator initialization
net = init(net);                 % weight initialization
net.trainParam.goal = 0.01;      % stop criterion - target mse
net.trainParam.epochs = 400;     % maximum number of training epochs
net = train(net, trainingset(:, 1)', trainingset(:, 2)');  % network training
end
Main program
% input data area
load('snn_a.mat');                    % loads the 'train' matrix
load('snn_b.mat');                    % loads the 'test' matrix
hidden_neurons = 4;
% net training
net = train_net(train, hidden_neurons);
% computing the network outputs
resulttrain = net(train(:, 1)')';     % predictions on the training inputs
resulttest = net(test(:, 1)')';       % predictions on the test inputs
% plotting
hold on
sn = @(x) sin(x)./x;                  % element-wise division so fplot can evaluate vectors
fplot(sn, [0, 15],'g');
plot(train(:, 1), resulttrain, 'r');
legend('Original function', 'Result')
hold off
% print mse results
mse(net, train(:, 2)', resulttrain')  % training-set error
mse(net, test(:, 2)', resulttest')    % test-set error
Could you explain train_net() and the main program?
Is there any way to improve this?
Answer 0 (score: 2)
There is not much to say about it.
train_net basically uses the function newff to create a feed-forward backpropagation network with the given parameters (number of hidden neurons, number of epochs, error goal, ...), and it uses your training data set to train the network, i.e. to adjust the neuron weights.
Your main program then uses the trained network to obtain predictions for the training set and the test set.
It then plots the ideal expected result against the results on the training and test sets to show how the network performs.
Finally, it computes the mse for a numerical analysis of the performance.
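As for possible improvements: newff and rand('state', ...) are deprecated in recent Neural Network Toolbox releases, so one option is to switch to fitnet and rng. Below is only a minimal sketch of the same workflow under that assumption; it reuses the snn_a.mat / snn_b.mat files generated above, and the struct-style load and variable names (xtr, ttr, xte, tte) are illustrative, chosen so that the loaded 'train' matrix does not shadow the toolbox function train.
S = load('snn_a.mat');                 % load into structs so the 'train' matrix
T = load('snn_b.mat');                 % does not shadow the toolbox function train()
xtr = S.train(:, 1)'; ttr = S.train(:, 2)';   % training inputs / targets as row vectors
xte = T.test(:, 1)';  tte = T.test(:, 2)';    % test inputs / targets
rng('shuffle');                        % replaces the deprecated rand('state', sum(100*clock))
net = fitnet(4, 'trainlm');            % feed-forward net: tansig hidden layer, purelin output
net.trainParam.goal = 0.01;            % same stopping criterion as train_net
net.trainParam.epochs = 400;
% net.divideFcn = 'dividetrain';       % uncomment to train on all samples, as the original script does
net = train(net, xtr, ttr);            % weights are initialized automatically by train
ytr = net(xtr);                        % predictions on the training set
yte = net(xte);                        % predictions on the test set
mse(net, ttr, ytr)                     % training-set mse
mse(net, tte, yte)                     % test-set mse
Note that fitnet by default holds out part of the data for validation (dividerand), which adds early stopping on top of the epoch limit; that alone is a reasonable improvement over training on all 40 points with a fixed epoch count.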