Equation from a simple feedforward neural network
Mohamed BENALLAL on 16 Apr 2016
Commented: Mohamed BENALLAL on 19 Apr 2016
Hi everyone, I'm working on a piece of code that writes the full equation of a feedforward neural network (FNN) to a text file, taking into account all weights and biases. I already have the trained FNN stored (the "net" file). The first step is to check whether I get the same result when using:
load Net17 net
input = [12,0.2]; % an input example
output = net(input');
and when I do this:
IW = net.IW{1,1} ;
b1 = net.b{1};
b2 = net.b{2};
LW = net.LW{2,1};
y = b2 + LW * tansig( IW * input' + b1 );
Note that my net is a simple FNN with one hidden layer: 2 input neurons, 17 hidden neurons (for this example), and one output neuron.
I don't know if I made a mistake, but the results are different:
output = 353.3947
y = -7.8709
Any suggestions? Thanks...
References:
http://fr.mathworks.com/help/nnet/ug/multilayer-neural-network-architecture.html?refresh=true
http://fr.mathworks.com/help/nnet/ref/setwb.html
http://fr.mathworks.com/help/nnet/ref/tansig.html
http://fr.mathworks.com/matlabcentral/answers/165233-neural-network-how-does-neural-network-calculate-output-from-net-iw-net-lw-net-b
Accepted Answer
Greg Heath on 19 Apr 2016
You forgot that the net uses the default MAPMINMAX to normalize the input and target before training, and then to denormalize the output.
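For illustration, here is a minimal sketch of the manual calculation with the normalization steps included. It assumes the default processing functions {'removeconstantrows','mapminmax'} on both the input and the output, so the mapminmax settings sit at index 2 of processSettings; check net.inputs{1}.processFcns on your net to confirm the right index:
load Net17 net
input = [12,0.2];
% normalize the input with the stored mapminmax settings
% (index 2 assumes processFcns = {'removeconstantrows','mapminmax'})
xn = mapminmax('apply', input', net.inputs{1}.processSettings{2});
% forward pass through the hidden and output layers
IW = net.IW{1,1};
b1 = net.b{1};
LW = net.LW{2,1};
b2 = net.b{2};
yn = b2 + LW * tansig( IW * xn + b1 );
% denormalize the output back to target units
y = mapminmax('reverse', yn, net.outputs{2}.processSettings{2});
With the two mapminmax steps in place, y should match output = net(input').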
Hope this helps.
Thank you for formally accepting my answer
Greg