Neural Network applied to compute square root

Hello to the community,
I have recently joined the world of neural networks. To get a better understanding, I have implemented a feed-forward network with backpropagation and momentum on my own.
It looks OK. But now I would like to apply it to non-binary data; for example, I would like the NN to learn the square root. I tried to normalize my input and output data, but the results are quite bad.
So if someone could help me with that, I would appreciate it.
A = [1;4;9;16;25;36;49;64;81;100]; % Inputs used to train the NN
B = [1;2;3;4;5;6;7;8;9;10]; % Target outputs used to train the NN
RealGuess = [5;1;200;4;8;54;23;15;99;65]; % The data on which you want to use the trained NN
% Min-max normalization to [0,1]
% ------------------------------
A2 = (A(:) - min(A(:)))/max(A(:)-min(A(:)));
B2 = (B(:) - min(B(:)))/max(B(:)-min(B(:)));
numIn = length(A(:)); % Size of the input column from the input file you
                      % gave; it is the number of neurons that will be used
bias=-1; % bias, threshold
disp('Enter learning rate');
lr=input('learning rate lr='); % learning rate
weights = -1*2.*rand(3,1); % weights, computed randomly for a first guess
disp('Enter iterations');
iterations=input('iterations ='); % Number of iterations before stopping
Error_Tol = 10; % Initial dummy error tolerance
counter = 0; % Initialization of the counter of iterations
disp('Enter the tolerance (stopping condition)');
Tol=input('Tolerance Tol=');
%--------------------------------------------------------------------------
% Training of the NN
%--------------------------------------------------------------------------
% Set up of the plot ..................................................
figure, hold on;
title('Error between the output from the trained NN and the desired output'...
,'fontsize',14);
xlabel('Iterations','fontsize',13);
ylabel('Error','fontsize',13);
% .....................................................................
% The heart of the algorithm is just below
while (counter<=iterations) && (Error_Tol>Tol) % Stopping condition: keep
                                               % iterating until we reach
                                               % the maximum number of
                                               % iterations OR the error
                                               % drops below the tolerance
                                               % we planned.
    out = zeros(4,1); % initialization of the output layer
    for j = 1:numIn % Loop over all the neurons
        y = bias*weights(1,1) + A2(j,1)*weights(2,1);
        out(j) = 1/(1+exp(-y));
        delta = B2(j)-out(j);
        weights(1,1) = weights(1,1)+lr*bias*delta;
        weights(2,1) = weights(2,1)+lr*A2(j,1)*delta;
    end
    % Error computation
    Error = out - B; % error for the 4 neurons
    plot(counter, Error,'o'); % plot them
    Error_Tol = 0.5 * (Error_Tol + sum(Error.^2)); % running average of the
                                                   % summed squared error
    counter = counter + 1; % iteration counter, used to evaluate the
                           % while condition
end
disp('Training......................................DONE');
%--------------------------------------------------------------------------
% Use of the Neural Network
% -------------------------
% We use the weights computed previously (that is, the trained NN) with
% a new set of input data, which gives us the corresponding outputs.
%--------------------------------------------------------------------------
trained = zeros(4,1);
for j = 1:numIn
    train = bias*weights(1,1)+...
        RealGuess(j,1)*weights(2,1);
    trained(j) = 1/(1+exp(-train));
end
% -------------------------------------------------------------------------
% Various Display at the end of the RUN
% -------------------------------------------------------------------------
%A = load ('InputAndExpectedOutput.txt');
input_data = A(:);
disp('Using the trained Neural Network..............DONE');
disp('Results/Outputs for the real guess via the trained Neural Network:');
trained
%trainedNorm=2*trained
% Denormalization (inverse of the min-max normalization)
% -------------------------------------------------------
trainedNorm = (max(input_data(:))- min(input_data(:)))*trained + ...
min(input_data(:));
trainedNorm
3 Comments
Alex on 12 Mar 2014 (edited 12 Mar 2014)
Thanks Greg for your comment.
No, I don't have the NN toolbox; I suppose it is easy to do with the toolbox?
Yes indeed, I get your point about 200, but if you just copy/paste the code you will see the bad results. It is not even a question of precision; I am really doing something wrong in the way I do the normalization.


Accepted Answer

Greg Heath on 16 Mar 2014
0. NOTATION: (input,hidden,target,output) = (x,h,t,y)
1. Probably need more than N = 10 data points
2. t(3) = 200 is an outlier; the net is designed for [1,100].
3. In order to detect outliers more readily, I recommend using ZSCORE to normalize to zero mean/unit variance (the MATLAB default is [-1,1]). In either case, use the bipolar sigmoid tanh for the hidden neuron transfer functions.
4. lr = ?
5. The smallest two-layer net (one input unit, one hidden neuron, one output neuron) will have Nw = 4 weights, not Nw = 3.
h = tanh( b1 + W1*x );
y = b2 + W2*h;
6. All initial weights should be small, random and bipolar; e.g., 0.01*randn(Nw,1)
7. One hidden neuron (as well as N=10) is woefully insufficient for this problem
8. iterations = ?
9. If min(y(unnormalized)) = 1, why would you tolerate Error_Tol = 10?
10. y = zeros(4,1) makes no sense. You have N = 10 1-dimensional examples.
11. for j = 1:numIn % Loop over all the neurons
No. In general you should have a TRIPLE loop: the outer loop is over iterations, the middle loop is over examples, and the inner loop is over weights. HOWEVER, in your case you only have 4 weights, so you don't need an inner loop. The sketch just below puts these points together.
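Putting points 3, 5, 6, 7 and 11 together, a rough sketch could look like the code below. This is NOT my posted code; it assumes plain incremental gradient descent without momentum, and H, lr and nepochs are illustrative values that I have not tuned for this problem:
x = (1:10).^2;   % inputs 1,4,...,100
t = 1:10;        % targets (t = sqrt(x))
N = numel(x);    % number of examples (N = 10 is probably too few, see point 1)
% Point 3: zscore-style normalization (zero mean / unit variance),
% done by hand so that no toolbox is needed
mux = mean(x); sx = std(x); zx = (x - mux)/sx;
mut = mean(t); st = std(t); zt = (t - mut)/st;
H = 5;           % hidden neurons (point 7: one is not enough)
lr = 0.01;       % learning rate (illustrative value)
nepochs = 20000; % outer loop over iterations (point 11)
% Point 6: small, random, bipolar initial weights
W1 = 0.01*randn(H,1); b1 = 0.01*randn(H,1);
W2 = 0.01*randn(1,H); b2 = 0.01*randn;
for epoch = 1:nepochs                   % outer loop: iterations
    for j = 1:N                         % middle loop: examples (NOT neurons)
        h = tanh(b1 + W1*zx(j));        % point 5: bipolar tanh hidden layer
        y = b2 + W2*h;                  % linear output neuron
        e = zt(j) - y;                  % error for this example
        dh = (W2.'*e).*(1 - h.^2);      % backprop through tanh
        W2 = W2 + lr*e*h.';  b2 = b2 + lr*e;
        W1 = W1 + lr*dh*zx(j); b1 = b1 + lr*dh;
    end
end
% Denormalize the prediction with the TARGET statistics, not the input's
xnew = 50;
h = tanh(b1 + W1*(xnew - mux)/sx);
ynew = (b2 + W2*h)*st + mut  % should be close to sqrt(50) ~ 7.07
Note that the prediction is denormalized with the target's mean and standard deviation, not the input's.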
That's enough for now.
I have posted several backpropagation code examples. However, I lost my copies when my computer crashed. Luckily, I found one in the NEWSGROUP for XOR that I modified from a poster. I don't know if it will help:
%Subject: Backpropagation Neural network Code problem
% From: Greg Heath
% Date: 26 Mar, 2009 11:50:30
% Message: 46 of 71
% Modification of code from Adeel Raza
Thank you for formally accepting my answer.
Greg

More Answers (1)

Alex on 20 Mar 2014 (edited 20 Mar 2014)
Thanks Greg, I will take a close look at that. I have to admit that I don't find your code examples easy to understand.
Thank you again.
