Cross-validation in a neural network

FZ on 11 Jul 2012
Edited: Greg Heath on 23 Feb 2018
Hi,
I need some clarification on applying cross-validation to a neural network. I have managed to get results from my NN; now I plan to apply cross-validation for model selection.
I have gone through the examples of *crossvalind* and *crossval*, but I don't really understand what a classifier is. In other words, what are the main things to consider in order to apply cross-validation?
TIA. Regards, FZ
  1 Comment
Greg Heath on 14 Jul 2012
In classification or pattern recognition, the data is assumed to be partitioned into c classes. The job of the classifier is to assign each input vector to the correct class. See the patternnet demo.
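For example, a minimal sketch along the lines of that demo (the iris_dataset call and the 10-node hidden layer are illustrative assumptions, not a prescription):
[x, t] = iris_dataset;      % x: 4xN inputs, t: 3xN one-hot targets (c = 3 classes)
net = patternnet(10);       % one hidden layer with 10 nodes
net = train(net, x, t);     % default random train/val/test division
y = net(x);                 % outputs approximate class posteriors
assigned = vec2ind(y);      % the classifier picks the most probable class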


Accepted Answer

Greg Heath on 14 Jul 2012
What do you mean by "model selection" ... making a choice between newrb and fitnet (regression) or patternnet (classification)? Or, given one of them with one hidden layer, choosing the minimum number of hidden nodes that can achieve the design goal?
I do not have crossvalind and haven't figured out how to use crossval for neural nets yet.
If I were in a hurry, I would just use randperm(N) to randomly divide the N cases of input/target pairs into 10 mutually exclusive subsets. Then use subset i (i = 1:10) for testing, subset j (j ~= i) for validation, and the remaining eight subsets for training. There is no need to shuffle data around because it can all be done with indexing.
With 10-fold XVAL there are 10*9 = 90 ordered combinations of validation and test subset pairs. Since there is no need to use both combinations (i,j) and (j,i), this reduces to 45. However, only 10 are needed, and since the subsets are random, it is sufficient to pair each test fold i with validation fold j = mod(i,10)+1.
I would then tabulate the separate train/val/test performances as well as their summary statistics.
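A minimal sketch of that indexing scheme, assuming the inputs x and targets t are already stored columnwise; fitnet and the 10 hidden nodes are illustrative choices:
N = size(x, 2);                        % number of cases (columns)
k = 10;                                % number of folds
ind = randperm(N);                     % random case order
fold = mod(0:N-1, k) + 1;              % fold label 1..k for each shuffled position
perf = zeros(k, 3);                    % train/val/test performance per fold
for i = 1:k
    j = mod(i, k) + 1;                 % paired validation fold
    net = fitnet(10);                  % 10 hidden nodes (illustrative)
    net.divideFcn = 'divideind';       % explicit index-based data division
    net.divideParam.testInd  = ind(fold == i);
    net.divideParam.valInd   = ind(fold == j);
    net.divideParam.trainInd = ind(fold ~= i & fold ~= j); % remaining eight subsets
    [net, tr] = train(net, x, t);
    perf(i, :) = [tr.best_perf, tr.best_vperf, tr.best_tperf]; % MSEs
end
Each row of perf then holds the separate train/val/test performances for one fold, ready for the summary statistics.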
  2 Comments
Greg Heath on 14 Jul 2012
If N is small, you may want to average over more than 10 of the 45 10-fold combinations or just use more than 10 folds.
Make sure that each design begins with a different set of random initial weights.
Md on 27 Sep 2013
Hello Mr. Greg, can you please provide the complete code for the following portion?
"If I were in a hurry, I would just use randperm(N) to randomly divide the N cases of input/target pairs into 10 mutually exclusive subsets. Then use subset i (i = 1:10) for testing, subset j (j ~= i) for validation, and the remaining eight subsets for training. There is no need to shuffle data around because it can all be done with indexing."
It would be a really great help for me. Thanks again.


More Answers (1)

ap hossain on 23 Feb 2018
Hello sir, I am a little bit confused about whether cross-validation should be applied to a neural network or is not needed. I am using MATLAB 2017a. Via the default nftool I am getting good results, and when I save the script there is no cross-validation function in the default program. So do I need cross-validation?
  1 Comment
Greg Heath on 23 Feb 2018
Edited: Greg Heath on 23 Feb 2018
1. You have asked a NEW question in an OLD ANSWER BOX.
2. Since the original question is 6 years old, you should have started a new post.
3. I DO NOT RECOMMEND k-fold (typically with k = 10) cross validation for neural net design.
REASONS
a. It is not a NN Toolbox option.
b. It is too time-consuming to either
i. learn how to modify the XVAL code from other toolboxes, or
ii. write your own error-free code.
c. It is easier to just loop through multiple designs with both
i. random data division
ii. random weight initialization
d. There are hundreds of examples of c in both the NEWSGROUP and ANSWERS. Good search words are: greg Ntrials
PS. I have posted tens of cases using XVAL. However, I still recommend my double-loop technique, with
a. the number of hidden nodes in the outer loop
b. random data division AND weight initialization in the inner loop
(a sketch of this double loop follows below).
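A minimal sketch of that double loop, assuming a fitnet regression design with x and t stored columnwise; the candidate node counts, Ntrials = 10, and the NMSE goal are illustrative assumptions:
MSE00 = mean(var(t', 1));             % reference MSE of the naive constant model
Hcand = 1:10;                         % candidate numbers of hidden nodes
Ntrials = 10;                         % random designs per candidate
NMSEtst = zeros(Ntrials, numel(Hcand));
for j = 1:numel(Hcand)                % outer loop: number of hidden nodes
    for i = 1:Ntrials                 % inner loop: random division and weights
        net = fitnet(Hcand(j));       % a fresh net gets new random initial weights
        [net, tr] = train(net, x, t); % default dividerand gives a new random division
        NMSEtst(i, j) = tr.best_tperf / MSE00; % normalized test-set MSE
    end
end
The smallest candidate whose designs reliably achieve the goal (e.g., NMSEtst <= 0.01) would then be chosen.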
Hope this helps.
Greg

