How do you implement k-fold and leave-one-out cross-validation in Python?
If you are using Python for machine learning, the scikit-learn library makes both k-fold and leave-one-out cross-validation straightforward. The following snippet shows how:
# Import the necessary modules
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import load_iris
# Load the iris dataset as feature matrix X and label vector y
X, y = load_iris(return_X_y=True)
# Create a logistic regression model (raise max_iter so the default
# lbfgs solver converges on the iris data without warnings)
model = LogisticRegression(max_iter=1000)
# Create a k-fold cross-validator with 5 folds
kf = KFold(n_splits=5, shuffle=True, random_state=42)
# Evaluate the model using k-fold cross-validation and get the mean score
score_kf = cross_val_score(model, X, y, cv=kf).mean()
print(f"Score with k-fold cross-validation: {score_kf:.3f}")
# Create a leave-one-out cross-validator
loo = LeaveOneOut()
# Evaluate the model using leave-one-out cross-validation and get the mean score
score_loo = cross_val_score(model, X, y, cv=loo).mean()
print(f"Score with leave-one-out cross-validation: {score_loo:.3f}")
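`cross_val_score` only returns per-fold scores, so if you need finer control (say, to inspect predictions on each fold or use a custom metric), you can iterate over the splitter yourself. A minimal sketch of the same 5-fold evaluation written as an explicit loop:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

X, y = load_iris(return_X_y=True)
kf = KFold(n_splits=5, shuffle=True, random_state=42)

fold_scores = []
for train_idx, test_idx in kf.split(X):
    # Fit a fresh model on the training portion of this fold
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])
    # Score on the held-out portion (accuracy by default for classifiers)
    fold_scores.append(model.score(X[test_idx], y[test_idx]))

print(f"Mean accuracy over {kf.get_n_splits()} folds: {np.mean(fold_scores):.3f}")
```

Swapping `KFold` for `LeaveOneOut()` in this loop gives the leave-one-out variant, since `LeaveOneOut` is effectively `KFold` with `n_splits` equal to the number of samples.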