
Linear Regression

Learn how to fit a straight line to data, interpret coefficients, and evaluate regression models using Python.

What is Linear Regression?

Linear regression models the relationship between one or more input variables (features) and a numeric output (target) using a straight line (or hyperplane).

For a single feature \( x \), the model is: \( \hat{y} = w x + b \) where:

  • \( w \) is the weight (slope).
  • \( b \) is the bias (intercept).
  • \( \hat{y} \) is the predicted value.
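The prediction rule above can be sketched directly in NumPy. The values of \( w \) and \( b \) here are arbitrary illustrative choices, not fitted parameters:

```python
import numpy as np

# Illustrative (not fitted) parameters: slope 0.2, intercept 0
w, b = 0.2, 0.0

x = np.array([500, 1000, 1500])  # feature values
y_hat = w * x + b                # the model: y_hat = w*x + b
print(y_hat)                     # [100. 200. 300.]
```

With multiple features, \( w \) becomes a vector and the product becomes a dot product, but the structure of the prediction is the same.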

Simple Example: One Feature

Predict House Price from Size
import numpy as np
from sklearn.linear_model import LinearRegression

# Feature: house size (sq ft)
X = np.array([[500], [750], [1000], [1250], [1500]])

# Target: price (in thousands of dollars)
y = np.array([100, 150, 200, 250, 300])

model = LinearRegression()
model.fit(X, y)

print("Slope (w):", model.coef_[0])
print("Intercept (b):", model.intercept_)

# Predict price for new size
new_size = np.array([[1200]])
pred = model.predict(new_size)
print("Predicted price (thousands):", pred[0])
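Because the toy data above lie exactly on a line (price = 0.2 × size), the fitted slope and intercept should recover that relationship almost exactly. A quick sketch checking that `model.predict` agrees with computing \( w x + b \) by hand:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[500], [750], [1000], [1250], [1500]])
y = np.array([100, 150, 200, 250, 300])

model = LinearRegression().fit(X, y)

# model.predict is just w*x + b; verify it by hand
w, b = model.coef_[0], model.intercept_
manual = w * 1200 + b
assert np.isclose(manual, model.predict([[1200]])[0])
print(round(manual, 2))  # 240.0 for this perfectly linear data
```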

Evaluating Regression Models

MSE and R²
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
import numpy as np

X = np.array([[500], [750], [1000], [1250], [1500], [1750], [2000]])
y = np.array([100, 150, 200, 250, 300, 320, 350])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

model = LinearRegression()
model.fit(X_train, y_train)

y_pred = model.predict(X_test)

mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)

print("MSE:", mse)
print("R² score:", r2)
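Both metrics have simple closed forms: MSE is the mean squared residual, and R² is 1 minus the ratio of residual error to the variance of the target (1.0 is a perfect fit; 0.0 means the model does no better than predicting the mean). A sketch computing each by hand on small illustrative values and checking against scikit-learn:

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

# Illustrative values, not the house-price data above
y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.0, 7.5, 9.0])

# MSE: average squared residual
mse = np.mean((y_true - y_pred) ** 2)

# R^2: 1 - SS_res / SS_tot
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

assert np.isclose(mse, mean_squared_error(y_true, y_pred))
assert np.isclose(r2, r2_score(y_true, y_pred))
print(mse)  # 0.125
print(r2)   # 0.975
```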