A DIFFICULT NEURAL NETWORK

laplace laplace
laplace laplace on 28 Apr 2013
I need help designing the following network. I'd appreciate some code, since I'm really new to both neural networks and MATLAB. Thanks in advance.
0) I have 50 4x1 matrices that I want to map to targets of 1 and 0 (50 data points).
1) I use 35 of them to train the network and 15 to test.
2) I take those 35 data points and split them into 7 folds of 5 points each, say:
i = 1,...,n = 7. From those 7 folds I pick one fold to test/validate, keep the rest to train the network, and repeat this 7 times, once per fold.
3) So now I have created 8 networks: the original one and another 7 from the data partitions I made.

Accepted Answer

Greg Heath
Greg Heath on 1 May 2013
Did you correct my numbering so that the second 2 is 3 and the 4 is 5?
MSEtrn00 = mean(var(ttrn',1)) % reference MSE of the naive constant model
MSEgoal = MSEtrn00/100
MinGrad = MSEtrn00/300
rng(0)
for h = Hmin:dH:Hmax % e.g., 1:10
    for n = 1:Ntrials % e.g., 1:10
        net = patternnet(h); % patternnet takes the hidden layer size, not the data
        net.divideFcn = 'dividetrain';
        net.trainParam.goal = MSEgoal;
        net.trainParam.min_grad = MinGrad;
        [net, tr] = train(net, xtrn, ttrn);
        bestepoch = tr.best_epoch;
        R2(n,h) = 1 - tr.perf(bestepoch)/MSEtrn00;
    end
end
R2 = R2
% The goal is R2 >= 0.99. Find the smallest h that yields acceptable results.
For the variance update, check a statistics book or work it out yourself. The mean update is given by
meanx(i+1) = meanx(i) + [x(i+1) - meanx(i)]/(i+1)
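For the variance, one standard incremental approach is Welford's online algorithm; a minimal MATLAB sketch with made-up example data (not from this thread):

```matlab
% Running mean and variance via Welford's online update.
% After the first i samples, meanx is their mean and M2 is the
% accumulated sum of squared deviations from the running mean.
x = [2 4 4 4 5 5 7 9];               % example data (illustration only)
meanx = 0; M2 = 0;
for i = 1:numel(x)
    delta = x(i) - meanx;            % deviation from the old mean
    meanx = meanx + delta/i;         % incremental mean update
    M2 = M2 + delta*(x(i) - meanx);  % uses old AND new mean
end
varx = M2/numel(x);                  % population variance, as in var(x,1)
% For this data, meanx = 5 and varx = 4, matching mean(x) and var(x,1).
```

Dividing M2 by numel(x)-1 instead gives the sample variance, var(x).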
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 Comments
laplace laplace
laplace laplace on 1 May 2013
Wow, thank you, that helped a lot! As for the 1st step of yours, my problem is the following:
I want to do this 5-fold cross-validation not on all 50 data points but on the 35 remaining ones. How can I manipulate those 35 data points after the training process of (my) step 1?
Greg Heath
Greg Heath on 2 May 2013
I do not understand why you would want to withhold 30% of an already small data set without repetition. You may have size problems even with all 50. With small numbers even small changes can make a significant difference. Therefore, the smaller the numbers, the more repetitions you will need to get the same confidence bounds.
My suggestion is to use H = 1:8, Ntrials = 10 (80 designs) with all 50 to determine how small you can afford to have H. With 4 inputs and 1 output you will then have Nw = (I+1)*H+(H+1)*O = O+(I+O+1)*H = 1+6*H unknown weights which will be less than your 50 equations if H <= 8. With 35 equations, H <= 5. Don't forget to use trainbr.
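The weight-count formula can be checked numerically (nothing here beyond the arithmetic above, with I = 4 inputs and O = 1 output):

```matlab
% Number of unknown weights in an I-H-O feedforward net:
%   input-to-hidden weights plus hidden biases: (I+1)*H
%   hidden-to-output weights plus output bias:  (H+1)*O
I = 4; O = 1;
H = 1:8;
Nw = (I+1)*H + (H+1)*O;   % = O + (I+O+1)*H = 1 + 6*H here
disp([H; Nw])
% Nw stays at or below 50 equations through H = 8 (Nw = 49),
% and at or below 35 equations through H = 5 (Nw = 31).
```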
Then using the selected H, perform f-fold XVAL on all 50. Repeat M times until the updated mean and standard deviation stabilize.
Regardless of whether N = 50 or 35, choose your subsets using randperm(N) for each of the M repetitions. Only initialize your random number generator once (at the very beginning) so that you can always duplicate previous efforts.
Hope this helps.
Greg


More Answers (7)

Greg Heath
Greg Heath on 29 Apr 2013
Performance estimates from training and validation subsets are biased, especially from small data subsets. Try to use subsets with at least 10 data points.
You may not have enough data to obtain reliable results with a validation set.
1. Try 5-fold cross-validation with 4 training subsets and 1 test subset.
2. Try to use the smallest number of hidden nodes that will yield reasonable results.
2. Omit the validation set (trn/val/tst = 4/0/1) but use either
a. msereg
b. The regularization option of mse
or
c. trainbr
3. Obtain and store the average and standard deviations of MSEtrn and MSEtst
4. Randomize the data and repeat this procedure M times until the updated mean and standard deviation of the 5*M estimates stabilize.
If you feel that you must use validation stopping, repeat the above with a 3/1/1 split with and/or without regularization.
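Steps 3 and 4 can be sketched as a repetition skeleton. The helper design_and_test below is a hypothetical placeholder (not from this thread) standing in for one train-and-evaluate cycle that returns the test-set MSE:

```matlab
% Skeleton for steps 3-4: repeat 5-fold XVAL M times, with a fresh
% randperm each repetition, and watch the running statistics stabilize.
N = 50; k = 5; M = 20;               % N/k must be an integer here
MSEtst = zeros(M, k);
for m = 1:M
    ind = randperm(N);               % fresh random fold assignment
    for f = 1:k
        tstind = ind((f-1)*N/k + 1 : f*N/k);   % one fold held out
        trnind = setdiff(ind, tstind);         % remaining k-1 folds
        % design_and_test is a HYPOTHETICAL helper: train on trnind,
        % return MSE on tstind
        MSEtst(m, f) = design_and_test(trnind, tstind);
    end
    % Stop repeating once these stabilize from one m to the next
    runmean = mean(reshape(MSEtst(1:m,:), [], 1));
    runstd  = std(reshape(MSEtst(1:m,:), [], 1));
end
```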
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 Comment
laplace laplace
laplace laplace on 30 Apr 2013
Can you provide me with some code regarding steps 2 and 5?



Greg Heath
Greg Heath on 10 May 2013
%1st step: i wanna train a NN with the patternnet algorithm, data and targets not shown here!
hiddenLayerSize = 1;
1. Why H =1?
net = patternnet(hiddenLayerSize);
% net.divideParam.trainRatio = 70/100;
% net.divideParam.valRatio = 15/100;
% net.divideParam.testRatio = 15/100;
2. Why specify default division ratios?
3. Why accept the default 'dividerand'?
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
4. Why isn't tr used to find separate trn/val/tst indices and performance results?
% View the Network
view(net)
%2nd step ****at this point i want to use the training set of step 1 and
%apply 5-fold cross validation to it
5. WHY? This makes no sense.
%the problem here is: 1) how to indicate that i use the training set of step 1
%and 2) mistakes in the code
Indices = crossvalind('Kfold',inputs , 5);
6. WHERE IS THIS FUNCTION?
for i=1:5
test = (Indices == i);
train = ~test;
for i = 1:5 %
7. Cannot use i for both loops. Do you mean n?
8. You are not using the k-fold data division to get different data for different loops
net = patternnet(inputs,targets,h); %test train
net.divideFcn = 'dividetrain';
9. This forces all data to be in the training set ??
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MinGrad;
[net,tr] = train(net,inputs,targets); % test train
bestepoch = tr.best_epoch;
R2(n,h) = 1 - tr.perf(bestepoch)/MSEtrn00;
end
10. Missing another end
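One possible way to address points 5-10 above, assuming crossvalind (Statistics Toolbox) is available and that h, MSEgoal, MinGrad, and the tr record returned by the step-1 train call are in scope. This is a sketch, not a definitive fix:

```matlab
% Recover the step-1 training subset from the step-1 training record tr
xtrn = inputs(:, tr.trainInd);
ttrn = targets(:, tr.trainInd);

Ntrn = size(xtrn, 2);
Indices = crossvalind('Kfold', Ntrn, 5);  % fold label (1..5) per column

R2 = zeros(5, 1);
for k = 1:5                          % single loop index; no reuse of i (point 7)
    tstmask = (Indices == k);        % held-out fold actually used (point 8)
    trnmask = ~tstmask;
    xk = xtrn(:, trnmask);
    tk = ttrn(:, trnmask);

    net = patternnet(h);             % h hidden nodes, chosen beforehand
    net.divideFcn = 'dividetrain';   % all of xk is training data (point 9)
    net.trainParam.goal = MSEgoal;
    net.trainParam.min_grad = MinGrad;
    net = train(net, xk, tk);

    yk = net(xtrn(:, tstmask));      % evaluate on the held-out fold
    MSEtst = mean((ttrn(:, tstmask) - yk).^2);  % single output assumed
    R2(k) = 1 - MSEtst/mean(var(tk', 1));
end                                  % every for needs its end (point 10)
```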
  1 Comment
laplace laplace
laplace laplace on 22 May 2013
Can you fix steps 6 and 8 for me, please? It'll take you 5 minutes, and I really can't think of anything else :(



laplace laplace
laplace laplace on 7 May 2013
Edited: laplace laplace on 7 May 2013
hiddenLayerSize = 1;
net = fitnet(hiddenLayerSize);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% View the Network
view(net)
% ****at this point i want to indicate that the new data is the training set above; how can i do that?***
Indices = crossvalind('Kfold',inputs , 5);
for i=1:5
test = (Indices == i);
train = ~test;
for i = 1:5 %
net = patternnet(inputs,targets,h); %test train
net.divideFcn = 'dividetrain';
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MinGrad;
[net,tr] = train(net,inputs,targets); % test train
bestepoch = tr.best_epoch;
R2(n,h) = 1 - tr.perf(bestepoch)/MSEtrn00;
end
Any advice/thoughts on this?
  1 Comment
Greg Heath
Greg Heath on 8 May 2013
I have no idea what you are trying to do. There are too many mistakes and no comments.
Please review, revise, insert comments and repost.



laplace laplace
laplace laplace on 9 May 2013
%1st step: i wanna train a NN with the patternnet algorithm,
%data and targets not shown here!
hiddenLayerSize = 1;
net = patternnet(hiddenLayerSize);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% View the Network
view(net)
%2nd step ****at this point i want to use the training set of step 1 and apply 5-fold cross validation to it
%the problem here is: 1) how to indicate that i use the training set of step 1
%and 2) mistakes in the code
Indices = crossvalind('Kfold',inputs , 5);
for i=1:5
test = (Indices == i);
train = ~test;
for i = 1:5 %
net = patternnet(inputs,targets,h); %test train
net.divideFcn = 'dividetrain';
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MinGrad;
[net,tr] = train(net,inputs,targets); % test train
bestepoch = tr.best_epoch;
R2(n,h) = 1 - tr.perf(bestepoch)/MSEtrn00;
end

laplace laplace
laplace laplace on 12 May 2013
How can I fix 4, 6, 8, and 9? I am stuck.

laplace laplace
laplace laplace on 14 May 2013
any ideas?

laplace laplace
laplace laplace on 22 May 2013
Can you fix steps 6 and 8 for me, please? It'll take you 5 minutes, and I really can't think of anything else :(
