HELP - Neural Network - Incredible sim(net,input) results - Why ?

Hello, I have a problem I cannot understand.
Once I have trained my network, I use the sim(net, input) function to get the results. The results are absurdly large, and they are different from the results obtained by manual matrix calculation using net.IW{1}, etc. I get results around 10^4 with sim, whereas the matrix calculation gives results around 1!
Here is a zip file of the workspace and the code that reproduces what I am talking about.
I also copy-paste the code here in case you want to have a quick look.
% In the workspace, 'imp' is the input data (7 variables per sample) and
% 'targ' is the target data
sz = size(imp);
% column indices used to split the data into training, validation, and test sets
d1 = round(sz(2)/2);           % first half of the dataset
d2 = round((sz(2)-d1)/2) + d1; % half of the remaining part, i.e. the third quarter
d3 = sz(2);                    % the last quarter
% network with one input layer (input size is 7), one hidden layer of 5
% neurons, and a single neuron in the output layer
net = newff(imp,targ,5);
% actual split into training, validation, and test sets
imp1 = imp(:,1:d1);      % training inputs
targ1 = targ(:,1:d1);    % training targets
VV.P = imp(:,d1+1:d2);   % validation set
VV.T = targ(:,d1+1:d2);
VT.P = imp(:,d2+1:d3);   % test set
VT.T = targ(:,d2+1:d3);
net.inputWeights{1,1}.initFcn = 'rands';
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
net = init(net);
net = train(net,imp1,targ1,[],[],VV,VT); % capture the trained network returned by train
% simulation on the full dataset
y1 = sim(net,imp);
% layer biases replicated across columns to match the size of the dataset for the direct calculation
B1 = net.b{1}*ones(1,size(imp,2)); % every column is identical and equal to net.b{1}
B2 = net.b{2}*ones(1,size(imp,2));
OutLayer1 = tansig(net.IW{1}*imp+B1);        % output of layer 1 (the hidden layer)
OutLayer2 = purelin(net.LW{2}*OutLayer1+B2); % output of layer 2, which is the output layer
y2 = OutLayer2; % just to give it an easy name
% now you can compare y1 and y2
plot(1:d3,y1,'o',1:d3,y2,'x');
% NOTE THE *10^4 ON THE Y-AXIS
  1 Comment
Greg Heath on 21 Nov 2011
What is the size of your data?
(Minor spelling: 'input' has an 'n', not an 'm', and 'columns' has only one 'o'.)
Initialize rand before calling newff.
Delete these four commands:
net.inputWeights{1,1}.initFcn = 'rands';
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'purelin';
net = init(net);
they are automatically covered by the call to newff.
newff also automatically scales the data. Include that scaling in
your direct calculation.
Hope this helps.
Greg
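For illustration, a minimal sketch of the reduced setup Greg describes might look like this (the seed value is arbitrary, and rand('state',...) is the older way to seed the generator; newer releases use rng):
rand('state',0);          % make the random weight initialization reproducible
net = newff(imp,targ,5);  % newff already sets tansig/purelin transfer functions and initializes the weights
net = train(net,imp1,targ1,[],[],VV,VT);
y1 = sim(net,imp);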


Accepted Answer

Phillip on 18 Nov 2011
I had the same problem. However, it is not a MATLAB nntool error, but rather the fact that the network you are using applies some pre- and post-processing steps.
Namely, the input and output data are mapped into the range [-1 1], mostly to prevent rapid saturation of the sigmoid function. So to create your own sim equivalent you also need to perform these pre- and post-processing steps. They require the range of each input and output, stored in net.inputs{1}.range and net.outputs{2}.range.
It is a little frustrating that this is not explicitly mentioned anywhere.
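To illustrate, a minimal sketch of such a sim-equivalent direct calculation might look like the following. It assumes the default mapminmax processing to [-1 1] and that the data contain no constant rows or NaNs; the mapping settings are recomputed here from the same imp and targ that were passed to newff, so they should match what the network stored internally:
[impn, ps] = mapminmax(imp);    % scale the inputs to [-1 1], as sim does internally
[targn, ts] = mapminmax(targ);  % record the mapping applied to the targets
B1 = net.b{1}*ones(1,size(impn,2));
B2 = net.b{2}*ones(1,size(impn,2));
OutLayer1 = tansig(net.IW{1}*impn+B1);        % hidden layer output
OutLayer2 = purelin(net.LW{2}*OutLayer1+B2);  % output layer, still in normalized units
y2 = mapminmax('reverse', OutLayer2, ts);     % map back to the original target range
% y2 should now match y1 = sim(net,imp) up to rounding error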

More Answers (0)
