Supervised ML: Regression Algorithms

Meghavarshini Krishnaswamy edited this page Feb 28, 2024

Supervised Machine Learning Algorithms

Scikit-learn: Regression Algorithms.

Scikit-learn is a popular open-source machine learning library for Python. It supports a wide range of supervised learning (regression and classification) and unsupervised learning models.

(Image credit: Jorge Leonel, Medium)

A list of regression algorithms from scikit-learn worth considering, with their import statements:

  1. Linear Regression (Ordinary Least Squares): `from sklearn.linear_model import LinearRegression`
  1. Stochastic Gradient Descent Regressor: `from sklearn.linear_model import SGDRegressor`
  1. Polynomial Regression: `from sklearn.preprocessing import PolynomialFeatures`
  1. Ridge Regression: `from sklearn.linear_model import Ridge`
  1. Lasso Regression: `from sklearn.linear_model import Lasso`
  1. Elastic Net: `from sklearn.linear_model import ElasticNet`
  1. Logistic Regression (despite its name, a classification algorithm): `from sklearn.linear_model import LogisticRegression`
  1. Support Vector Machine Regression: `from sklearn.svm import SVR`
  1. Decision Tree Regression: `from sklearn.tree import DecisionTreeRegressor`
  1. Random Forest Regression: `from sklearn.ensemble import RandomForestRegressor`
  1. K-Nearest Neighbors (KNN) Regression: `from sklearn.neighbors import KNeighborsRegressor`
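All of these estimators share the same `fit`/`predict` interface. A minimal sketch (using made-up synthetic data, not a dataset from this page) with `LinearRegression`:

```python
# Sketch of the common scikit-learn estimator API, shown with OLS.
# The synthetic data below is an assumption for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))                    # one feature
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.5, size=50)   # noisy line y = 3x + 2

model = LinearRegression()
model.fit(X, y)                 # every regressor above is fit the same way
y_pred = model.predict(X)       # ...and makes predictions the same way

print(round(model.coef_[0], 2), round(model.intercept_, 2))
```

Swapping in any other regressor from the list (e.g. `Ridge()` or `RandomForestRegressor()`) requires changing only the constructor line.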

Regression is the task of predicting a continuous-valued attribute associated with an object; the algorithms listed above are examples included in scikit-learn.


Supervised Learning Evaluation

Regression metrics:

`sklearn.metrics`

The scikit-learn library includes a large collection of regression metrics that can be used for measuring the performance of the algorithms.
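A minimal sketch of three common regression metrics from `sklearn.metrics`, applied to a small hand-made pair of true/predicted vectors (the values are illustrative, not from any dataset on this page):

```python
# Common regression metrics on a toy example.
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

y_true = [3.0, -0.5, 2.0, 7.0]   # illustrative "ground truth"
y_pred = [2.5, 0.0, 2.0, 8.0]    # illustrative model predictions

mse = mean_squared_error(y_true, y_pred)    # mean squared error
mae = mean_absolute_error(y_true, y_pred)   # mean absolute error
r2 = r2_score(y_true, y_pred)               # coefficient of determination

print(mse, mae, r2)
```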


Jupyter Notebook Example

Please open this Notebook in Google Colab.


Regression Algorithms in R

An example R script for performing a linear regression (ordinary least squares):


# Load the dataset
data <- read.csv("dataset.csv")

# View the first few rows of the dataset
head(data)

# Perform a linear regression
fit <- lm(dependent_variable ~ independent_variable, data=data)

# View the summary of the regression results
summary(fit)

# Calculate R-squared
rsq <- summary(fit)$r.squared

# Calculate MSE
predicted_values <- predict(fit, data)
mse <- mean((predicted_values - data$dependent_variable)^2)

# Calculate RMSE
rmse <- sqrt(mse)

# Print the metrics
cat("R-squared:", round(rsq, 3), "\n")
cat("MSE:", round(mse, 3), "\n")
cat("RMSE:", round(rmse, 3), "\n")

# Plot the regression line
plot(data$independent_variable, data$dependent_variable, main="Linear Regression", xlab="Independent Variable", ylab="Dependent Variable")
abline(fit, col="red")

An example R script for performing a polynomial regression of order n=2:

# Load the dataset
data <- read.csv("dataset.csv")

# View the first few rows of the dataset
head(data)

# Perform a polynomial regression
fit <- lm(dependent_variable ~ poly(independent_variable, 2), data=data)

# View the summary of the regression results
summary(fit)

# Plot the regression line
plot(data$independent_variable, data$dependent_variable, main="Polynomial Regression", xlab="Independent Variable", ylab="Dependent Variable")
lines(data$independent_variable, predict(fit), col="red")
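The same degree-2 polynomial fit can be sketched in Python using the `PolynomialFeatures` transformer from the algorithm list above, chained with `LinearRegression` in a pipeline (the quadratic synthetic data here is an assumption for illustration):

```python
# Python counterpart to the R polynomial regression above.
# PolynomialFeatures(degree=2) expands x into [1, x, x^2] before the linear fit.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 0] ** 2 + rng.normal(0, 0.2, 60)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print(model.score(X, y))   # R^2 on the training data
```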


Created: 02/22/2023 (C. Lizárraga); Last update: 01/19/2024 (C. Lizárraga)

CC BY-NC-SA