Viewing a cross validated classification tree in the "Classification Tree Viewer"

Hi,
I am using the cross-validation method 'crossval' as explained in the MATLAB help:
cvmodel = crossval(model) creates a partitioned model from model, a fitted classification tree. By default, crossval uses 10-fold cross validation on the training data to create cvmodel.
I have already produced the classification tree, with M being a matrix of predictor variables and A being a column vector of class labels:
tree = ClassificationTree.fit(M, A);
and can view this tree using 'view'
view(tree,'mode','graph')
Previously I had split matrix M into training and test groups; however, I would now like to use the cross-validation method 'crossval' to get a more reliable estimate of the tree's accuracy:
cvtree = crossval(tree);
I can then use kfoldLoss and kfoldPredict to determine the effectiveness of the tree; however, I cannot work out how to 'view' the tree in any sense. I have attempted the following:
view(cvtree)
and
view(cvtree,'mode','graph')
but I always get the same error:
view(cvtree)
Error using classreg.learning.internal.DisallowVectorOps/throwNoCatError (line 57)
Concatenation of classreg.learning.partition.ClassificationPartitionedModel objects is not allowed. Use a cell array to contain multiple objects.
Error in classreg.learning.internal.DisallowVectorOps/horzcat (line 48)
function a = horzcat(this,varargin), throwNoCatError(this); end
Error in view (line 66)
if(~all(isnumeric([viewArgs{:}])))
I was under the impression that the crossval method produces 10 trees, each trained on 90% of the data and tested on the remaining 10%, before selecting the tree that gives the greatest accuracy. Is this correct? And if so, why can I not view the resulting tree?
I really hope someone can help, as I have been trawling the internet and the MATLAB documentation for hours with no luck!
Thanks, Katie

Accepted Answer

Shashank Prasanna on 16 May 2013
Katie, when you cross-validate, the output is a ClassificationPartitionedModel, which means it contains all the cross-validated trees and uses all of them to do prediction. The individual trees can be accessed as follows:
>> cvtree.Trained
>> view(cvtree.Trained{1},'mode','graph')
>> view(cvtree.Trained{2},'mode','graph') % etc
The individual per-fold losses (based on your chosen loss function) can be accessed as follows:
>> cvtree.kfoldLoss('mode','individual')
You can view and use an individual tree if you like. Use 'kfoldPredict' to predict from the entire cross-validated model.
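For example, a minimal sketch of that last step, using the tree/cvtree objects from the question (kfoldPredict and kfoldLoss are called on the partitioned model alone; no test matrix is passed in, because each observation is predicted by the fold model that was not trained on it):
>> labels = kfoldPredict(cvtree); % out-of-fold predicted class for every row of M
>> err = kfoldLoss(cvtree); % average 10-fold classification error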
  2 Comments
Dhruv Ghulati on 25 Dec 2015
I have a question on kfoldPredict. What would the second parameter in kfoldPredict be: the entire predictor matrix, or a test subset? E.g.
kfoldPredict(forestModel, XTest);?
Would you do the prediction on a test set that is built using
c = cvpartition(catresponse, 'HoldOut', 0.3);
or
c = cvpartition(catresponse, 'KFold', 10);
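For reference, a minimal sketch contrasting the two approaches, using M and A from the original question (the 'catresponse' here plays the role of A; names such as 'tr' and 'c2' are only illustrative). kfoldPredict is called on the cross-validated model alone, with no data argument; to predict on a genuinely separate test set, the usual route is a 'HoldOut' partition plus 'predict':
>> c = cvpartition(A,'HoldOut',0.3); % single 70/30 train/test split
>> tr = ClassificationTree.fit(M(training(c),:), A(training(c))); % fit on the 70%
>> yTest = predict(tr, M(test(c),:)); % predict on the held-out 30%
>> c2 = cvpartition(A,'KFold',10); % 10-fold partition
>> cv2 = crossval(tree,'CVPartition',c2); % cross-validate with explicit folds
>> yCV = kfoldPredict(cv2); % out-of-fold predictions; no test matrix passed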
