Logistic Regression

Learn how logistic regression models probabilities for binary classification, using the sigmoid function and simple scikit-learn examples.

What is Logistic Regression?

Despite its name, logistic regression is a classification algorithm, not a regression algorithm. It predicts the probability that an input belongs to the positive class.

For a single feature \( x \), logistic regression computes: \( p = \sigma(w x + b) \), where \( w \) is a learned weight, \( b \) is a learned bias (intercept), and \( \sigma \) is the sigmoid function.
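As a quick sketch, the probability for a single input can be computed directly from this formula. The values of w and b below are hypothetical, chosen only for illustration; in practice they are learned from data:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Hypothetical learned parameters (illustration only)
w, b = 2.0, -1.0

x = 1.5
p = sigmoid(w * x + b)  # probability of the positive class
print(p)  # ≈ 0.88
```

A probability above 0.5 would typically be classified as the positive class.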

Sigmoid Function

The sigmoid function maps any real number to a value between 0 and 1: \( \sigma(z) = \frac{1}{1 + e^{-z}} \).

Visualizing Sigmoid
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = np.linspace(-10, 10, 200)
plt.plot(z, sigmoid(z))
plt.title("Sigmoid Function")
plt.xlabel("z")
plt.ylabel("σ(z)")
plt.grid(True, alpha=0.3)
plt.show()
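Two properties of the sigmoid are worth verifying numerically: it equals exactly 0.5 at \( z = 0 \), and it satisfies the symmetry \( \sigma(-z) = 1 - \sigma(z) \), which is why the two class probabilities always sum to 1:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

print(sigmoid(0.0))                   # 0.5 at the decision boundary
print(sigmoid(5.0) + sigmoid(-5.0))   # ≈ 1.0 (symmetry)
print(sigmoid(10.0))                  # approaches 1 for large z
print(sigmoid(-10.0))                 # approaches 0 for large negative z
```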

Example: Binary Classification

Logistic Regression with scikit-learn
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, classification_report

# Load binary classification dataset
data = load_breast_cancer()
X, y = data.data, data.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale features for better convergence
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# Raise max_iter so the solver has room to converge
model = LogisticRegression(max_iter=1000)
model.fit(X_train_scaled, y_train)

y_pred = model.predict(X_test_scaled)

print("Accuracy:", accuracy_score(y_test, y_pred))
print("\nClassification report:\n", classification_report(y_test, y_pred))
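Because logistic regression models probabilities, predict_proba exposes them directly, and predict is equivalent to thresholding the positive-class probability at 0.5. A short sketch repeating the same pipeline to show this:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

model = LogisticRegression(max_iter=1000)
model.fit(X_train_scaled, y_train)

# predict_proba returns one column per class; the columns of
# each row sum to 1, and column 1 is the positive-class probability
proba = model.predict_proba(X_test_scaled)
print(proba[:3])

# Thresholding column 1 at 0.5 reproduces predict()
manual_pred = (proba[:, 1] >= 0.5).astype(int)
print((manual_pred == model.predict(X_test_scaled)).all())
```

Working with the probabilities directly is useful when the cost of false positives and false negatives differs, since the 0.5 threshold can then be adjusted.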