
K Fold Cross-Validation & its Implementation | Machine Learning

source link: https://www.geeksforgeeks.org/videos/k-fold-cross-validation-its-implementation/

Hello everyone, welcome to the session.
  • 13/06/2022

In this video, we are going to see how to implement Random Forest with K-fold cross-validation. Random Forest is an ensemble technique capable of performing both regression and classification tasks; it combines multiple decision trees using a technique called bootstrap aggregation, commonly known as bagging.
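The sketch below shows what such a model looks like, assuming scikit-learn; the synthetic dataset from make_classification is only a stand-in, not the data used in the video:

```python
# Minimal sketch of a Random Forest classifier (assumes scikit-learn;
# the synthetic dataset is a stand-in, not the video's data).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# Each tree is fit on a bootstrap sample of the data, and the trees'
# predictions are aggregated by majority vote (bagging).
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X, y)
```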

K-fold cross-validation is a technique for resampling a dataset in order to evaluate a machine learning model. The parameter K refers to the number of subsets (folds) that the given dataset is split into. In each round, K-1 subsets are used to train the model and the one left-out subset is used as the validation set.
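To make the splitting concrete, here is a minimal sketch, assuming scikit-learn's KFold on a toy array of 10 samples; each fold is held out exactly once while the remaining K-1 folds are used for training:

```python
# Minimal sketch of K-fold splitting (assumes scikit-learn; the toy
# array stands in for a real dataset).
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(20).reshape(10, 2)  # 10 samples, 2 features
kf = KFold(n_splits=5, shuffle=True, random_state=42)

# Each iteration yields K-1 folds for training and 1 fold for validation.
for fold, (train_idx, val_idx) in enumerate(kf.split(X)):
    print(f"Fold {fold}: train={train_idx}, validation={val_idx}")
```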

To apply K-fold cross-validation, the dataset is split into three parts: training, testing, and validation. The challenge is that every such split reduces the volume of data available for training the model.

1) Training - the data used to fit the machine learning model, which is then evaluated on the validation set or test set.

2) Testing - the trained model is tested to calculate the prediction error on data it has not seen.

3) Validation - the overall prediction error is generated by taking the average of the prediction errors from every fold, as shown in the sketch after this list.
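Putting the steps together, the following is a minimal end-to-end sketch, assuming scikit-learn and synthetic stand-in data: the model is trained and tested on each fold, and the per-fold scores are averaged into one overall estimate:

```python
# Minimal sketch: Random Forest evaluated with 5-fold cross-validation
# (assumes scikit-learn; the synthetic dataset is a stand-in).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
model = RandomForestClassifier(n_estimators=100, random_state=42)

# cross_val_score fits and scores the model once per fold.
scores = cross_val_score(model, X, y, cv=5)
print("Per-fold accuracy:", scores)
print("Mean accuracy:", scores.mean())  # the overall estimate
```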

Cross-Validation in Machine Learning: https://www.geeksforgeeks.org/cross-validation-machine-learning/

