I'm trying to find out where I'm making a mistake, and I'd be glad if you could help me.
Here is my problem:
Called serially, train (the function from the Neural Network Toolbox) behaves one way, but when I put it inside a parfor loop everything goes crazy.
>> version
ans =
8.3.0.532 (R2014a)
Here is the function:
function per = neuralTr(tSet,Y,CrossVal,Ycv)
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand'; % Divide data randomly
redeT.divideMode = 'sample'; % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio = 10/100;
redeT.divideParam.testRatio = 10/100;
redeT.trainFcn = 'trainscg'; % Scaled conjugate gradient
redeT.performFcn = 'crossentropy'; % Cross-entropy
redeT.trainParam.showWindow = 0; % default is 1
redeT = train(redeT,tSet,Y);
outputs = sim(redeT,CrossVal);
per = perform(redeT,Ycv,outputs);
end
Here is the code I typed:
Data loaded in the workspace:
whos
  Name          Size       Bytes  Class     Attributes

  CrossVal    282x157     354192  double
  Y             2x363       5808  double
  Ycv           2x157       2512  double
  per           1x1            8  double
  tSet        282x363     818928  double
Executing the function serially:
per = neuralTr(tSet,Y,CrossVal,Ycv)
per =
0.90
Starting the parallel pool:
>> parpool local
Starting parallel pool (parpool) using the 'local' profile ... connected to 12 workers.
ans =
Pool with properties:
Connected: true
NumWorkers: 12
Cluster: local
AttachedFiles: {}
IdleTimeout: Inf (no automatic shut down)
SpmdEnabled: true
Executing the function 12 times in parallel:
per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.96]
[0.83]
[0.92]
[1.08]
[0.85]
[0.89]
[1.06]
[0.83]
[0.90]
[0.93]
[0.95]
[0.81]
Executing it again to check whether random initialization produces different values:
per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.96]
[0.83]
[0.92]
[1.08]
[0.85]
[0.89]
[1.06]
[0.83]
[0.90]
[0.93]
[0.95]
[0.81]
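The values differ from worker to worker, yet both parfor runs return exactly the same twelve numbers. One diagnostic worth running (my sketch, not from the original post) is to inspect each worker's random-number generator before and after a run: every pool worker has its own generator state, so seeing whether those states advance between runs would show whether the repetition comes from the workers' RNG streams being reused or reset.
spmd
    s = rng;    % current generator settings on this worker
    fprintf('worker %d: generator %s, seed %u\n', labindex, s.Type, s.Seed);
end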
EDIT 1: Running the function with a plain for loop only:
per = cell(12,1);
for ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
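(All twelve serial values display as 0.90 under the default short format; two quick checks I would try here, not from the original post: show more digits to rule out rounding masking small differences, and test whether the call consumes client-side random numbers at all.)
format long
per{1}                                 % are the serial results truly identical?
s1 = rng;                              % record the client RNG state
dummy = neuralTr(tSet,Y,CrossVal,Ycv);
isequal(s1, rng)                       % false => the call advanced the client RNG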
EDIT 2: I modified my function and now everything works great. Maybe the problem lies in how the data is divided when running in parallel, so I split the data before sending it to the workers. Thanks a lot!
function per = neuralTr(tSet,Y,CrossVal,Ycv)
indt = 1:round(size(tSet,2) * 0.8) ;
indv = round(size(tSet,2) * 0.8):round(size(tSet,2) * 0.9);
indte = round(size(tSet,2) * 0.9):size(tSet,2);
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand'; % Divide data randomly
redeT.divideMode = 'sample'; % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio = 10/100;
redeT.divideParam.testRatio = 10/100;
redeT.trainFcn = 'trainscg'; % Scaled conjugate gradient
redeT.performFcn = 'crossentropy'; % Cross-entropy
redeT.trainParam.showWindow = 0; % default is 1
redeT = train(redeT,tSet,Y);
outputs = sim(redeT,CrossVal);
per = zeros(12,1);
parfor ii = 1 : 12
redes = train(redeT,tSet,Y);
per(ii) = perform(redes,Ycv,outputs);
end
end
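(Note that the index vectors indt, indv, and indte computed at the top of this version are never actually used. Presumably the intent was a fixed split via 'divideind'; a minimal sketch of that variant, my assumption rather than the author's posted code:)
% Hypothetical: replace 'dividerand' with a deterministic split using
% the precomputed indices, so every call trains on the same partition.
redeT.divideFcn = 'divideind';
redeT.divideParam.trainInd = indt;   % first ~80% of the columns
redeT.divideParam.valInd   = indv;   % next ~10%
redeT.divideParam.testInd  = indte;  % final ~10%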
Result:
>> per = neuralTr(tSet,Y,CrossVal,Ycv)
per =
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
Answer (score: 2)
Oh! I think I found it, but I can't test it.
In your code you have:
redeT.divideFcn = 'dividerand'; % Divide data randomly
If each worker picks the data randomly, then it is to be expected that they get different results, isn't it?
Try the following:
per = cell(12,1);
parfor ii = 1 : 12
rng(1); % set the RNG seed so the same numbers are generated every time
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
I'm not sure whether the seed set here actually takes effect inside neuralTr, but give it a try.
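If seeding outside the call turns out not to help (both the data division and the weight initialization happen inside train), a variant would be to set the seed inside neuralTr itself, right before training. A sketch, assuming you can edit the function:
rng(1);                   % fix the seed so 'dividerand' and the weight
                          % initialization draw the same numbers on each call
redeT = train(redeT,tSet,Y);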