For my project work I have used an Elman neural network with the resilient backpropagation algorithm and the Nguyen-Widrow algorithm for generating initial layer weights. I observed a large difference between outputs in different trials

For my project work I have used an Elman neural network (ENN) with the resilient backpropagation algorithm ('trainrp') and the Nguyen-Widrow algorithm for generating initial layer weights. I observed a large difference between outputs across trials: the first time I trained the network it gave 94% accuracy, but the second time, with the same inputs and targets, I got only 64%. After training the first time I didn't save the network. Please suggest ways to avoid this difference between consecutive trials. I am using MATLAB 2010; I created the ENN using nntool and then switched it to 'trainrp' in code, because creating the ENN with 'trainrp' directly gave me an error.

Accepted Answer

Walter Roberson
Walter Roberson on 25 Mar 2014
Weights are initialized randomly for neural networks unless you initialize them manually. If you are getting as large a difference as you are seeing, that suggests your network is not robust.
You can control the random number seed to reproduce particular networks. In R2010, see the documentation for the RandStream class; R2011a introduced rng() as a simpler way to set the random seed.
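A minimal sketch of both approaches (assuming the network variable `net` and training data `inputs`/`targets` already exist in the workspace):

```matlab
% R2010: fix the global random stream so weight initialization is repeatable
s = RandStream('mt19937ar', 'Seed', 0);
RandStream.setDefaultStream(s);   % renamed setGlobalStream in R2011a

% R2011a and later: the one-line equivalent
% rng(0);

net = init(net);                  % init now draws the same random numbers every run
net = train(net, inputs, targets);
```

With the seed fixed before init, repeated runs start from identical weights, so any remaining run-to-run variation comes from the training process itself rather than the initialization.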
  1 Comment
Balaji
Balaji on 25 Mar 2014
I used the Nguyen-Widrow algorithm to initialize weights with net = initnw(net,layer); but the results still had great variation.
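Note that initnw itself draws random numbers (the Nguyen-Widrow method picks random directions for the initial weight vectors), so calling it alone does not make runs repeatable; the RNG must be seeded first. A sketch, where `layer` is the layer index as in the comment above:

```matlab
% Seed the RNG *before* Nguyen-Widrow initialization; initnw uses random numbers
s = RandStream('mt19937ar', 'Seed', 0);
RandStream.setDefaultStream(s);
net = initnw(net, layer);   % now produces the same initial weights each run
```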


More Answers (1)

Greg Heath
Greg Heath on 26 Mar 2014
Search Answers for one of my design examples, e.g. with the queries:
greg net rng(0)
greg net rng(4151941)
greg net rng('default')
As Walter has mentioned, there are probably more up-to-date ways of initializing the RNG.
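The usual pattern in those design examples is a loop of candidate designs with a recorded seed per trial, so the best network can be recreated later. A sketch (variable names like `Ntrials`, `x`, `t` are illustrative, not from the thread):

```matlab
% Repeatable multi-trial design loop
Ntrials = 10;
for trial = 1:Ntrials
    rng(trial)                    % R2011a+; use RandStream in R2010
    net = init(net);              % fresh but reproducible random weights
    [net, tr] = train(net, x, t);
    % record tr.best_perf together with the seed 'trial';
    % re-running with the winning seed recreates the best network
end
```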

