Probleme de tracer l'évolution de l'erreur quadratique moyenne des algorithmes d'apprentissage automatique.

bonjour, svp comment faire pour tracer l'evolution de l'erreur quadratique moyenne pour les diffrents algorithmes d'apprentissage automatique tels que :LASSO,ELASTICNET, BAGTREE,BOOST,SVR, ELM, KRR et GPR.voila ce que j'ai fait mais c'est nul je ne comprend pas.
% Import the CSV data
X_train = readtable('traindata.csv');
% Display the names of the variables in the dataset
variableNames = X_train.Properties.VariableNames;
fprintf('The names of the variables in the dataset are:\n');
for i = 1:length(variableNames)
fprintf('%s\n', variableNames{i});
end
who;
% Get the values of the `I1` variable
I1=X_train.I1;
I2=X_train.I2;
I3=X_train.I3;
Vdcmean1=X_train.Vdcmean1;
Pdcmean1=X_train.Pdcmean1;
% Select the `I1`, `I2`, and `i3` variables
I1I2I3= X_train(:, {'I1', 'I2', 'I3' 'Pdcmean1'});
% Créez un tableau contenant les noms des variables à utiliser pour les algorithmes de processus gaussiens
variable_names = {'I1', 'I2', 'I3', 'Pdcmean1'};
% Créez un tableau contenant les données à utiliser pour les algorithmes de processus gaussiens
X_train = X_train(:, variable_names);
% Créez un vecteur contenant les labels de sortie
labels = X_train.Pdcmean1;
% Itérez sur les différents algorithmes de processus gaussiens
function_mean=@(x)0;
function_cov=@(x,x_prime)exp-(x-x_prime)*(x-x_prime)/0.1;
x=X_train;
gp=GaussienProcessRegressor(function_mean, function_cov);
algorithm_name = {'GaussianProcessRegression', 'GaussianProcessClassification'};
gp = GaussianProcessRegressor(function_mean, function_cov);
% Créez un objet algorithme
algorithm = feval(algorithm_name, data, labels);
% Entraînez l'algorithme
[algorithm, train_rmse] = train(algorithm, data);
% Tracez l'évolution du RMSE lors de la phase de formation
figure;
plot(train_rmse);
xlabel('Étape d''entraînement');
ylabel('RMSE');
title(algorithm_name);
%Separation des variable et des classes
%variables=data(:, 1:10);
%classe =data(:, 1:3);
% Separation of the features and label
%X_train=data(:,{'I1','I2','I1MAX','I1MIN','I1VAR','I2MAX','I2MIN','I2VAR','I3','I4','I3max','I3min','I3var','I4MAX','I4MIN','I5','I6','Itotal1','Pdcmean1','class'});
% Split the data into training and test sets
X_train = data(1:600, 1:30);
%y_train = data(1:600, 31);
X_test = data(101:end, 1:2);
%y_test = data(101:end, 3);
% Create a list of regression models
%Convert categorical variables to dummy variables
% Train the GPR model
model_gpr = fitrgp(X_train, y_train);
% Train the LASSO model
model_lasso = lasso(X_train, y_train);
% Train the Elastic net model
model_elastic_net = elasticnet(X_train, y_train);
% Train the Bagging model
model_bagging = bagging(X_train, y_train);
% Train the boosting model
model_boosting = boosting(X_train, y_train);
% Train the KRR model
model_krr = kernel_ridge(X_train, y_train);
% Train the SVR model
model_svr = support_vector_regression(X_train, y_train);
% Train the ELM model
model_elm = extreme_learning_machine(X_train, y_train);
% Train the RVM model
model_rvm = relevance_vector_machine(X_train, y_train);
% Initialize the RMSEs
rmses = zeros(1, length(models));
% Train the models
for i = 1:length(models)
% Train the model
model = eval(models{i});
model.fit(X_train, y_train);
% Calculate the RMSE on the training set
y_pred = model.predict(X_train);
(i, 1) = sqrt(mean((y_pred - y_train).^2));
Calculate the RMSE on the test set
= model.predict(X_test);
vrmses(i, 2) = sqrt(mean((y_pred - y_test).^2));
end
Plot the RMSEs
figure;
plot(rmses(:, 1), '-o', 'MarkerSize', 10, 'LineWidth', 2);
hold on;
plot(rmses(:, 2), '-x', 'MarkerSize', 10, 'LineWidth', 2);
legend('RMSE sur données d''entrainement', 'RMSE sur données de test');
xlabel('etape d''entrainement');
ylabel('RMSE');

Answers (0)

Asked:

on 12 Oct 2023

Community Treasure Hunt

Find the treasures in MATLAB Central and discover how the community can help you!

Start Hunting!