Calculus Interview Q&A for Data Science

Fast revision of derivatives, gradients, and optimization concepts used in ML.

1. Why is calculus important in Machine Learning? (easy)
Answer: Calculus helps us optimize model parameters by measuring how loss changes with respect to each parameter.
2. What is a derivative? (easy)
Answer: A derivative gives the instantaneous rate of change (slope) of a function at a point.
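As a quick sanity check, the slope can be approximated numerically; a minimal sketch (the `derivative` helper is a hypothetical name, not a library function):

```python
# Approximate f'(x) with the central-difference formula (f(x+h) - f(x-h)) / 2h.
def derivative(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

# f(x) = x^2 has f'(x) = 2x, so the slope at x = 3 should be about 6.
slope = derivative(lambda x: x ** 2, 3.0)
```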
3. What is a partial derivative? (easy)
Answer: It measures change in a multivariable function with respect to one variable while others are fixed.
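The "others are fixed" idea translates directly to code; a sketch with a hypothetical `partial_x` helper:

```python
# ∂f/∂x at (x, y): nudge x while y stays fixed.
def partial_x(f, x, y, h=1e-5):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

f = lambda x, y: x ** 2 * y   # ∂f/∂x = 2xy, so at (2, 3) it is 12
val = partial_x(f, 2.0, 3.0)
```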
4. What is a gradient vector? (medium)
Answer: The gradient is the vector of partial derivatives; it points in the direction of steepest increase of the function.
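Stacking one partial derivative per coordinate gives the gradient; a minimal pure-Python sketch (the `gradient` helper is illustrative):

```python
# Gradient = vector of partial derivatives, one per coordinate.
def gradient(f, point, h=1e-5):
    grad = []
    for i in range(len(point)):
        hi = list(point); hi[i] += h   # nudge coordinate i up
        lo = list(point); lo[i] -= h   # and down, others fixed
        grad.append((f(hi) - f(lo)) / (2 * h))
    return grad

# f(x, y) = x^2 + y^2 has gradient (2x, 2y); at (1, 2) that is (2, 4).
g = gradient(lambda p: p[0] ** 2 + p[1] ** 2, [1.0, 2.0])
```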
5. How is gradient descent connected to calculus? (medium)
Answer: Gradient descent uses derivatives to move parameters in the direction opposite the gradient, reducing the loss step by step.
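The update rule fits in a few lines; a sketch for a 1-D loss, assuming the derivative is known in closed form:

```python
# Minimize f(x) = (x - 3)^2 by stepping against f'(x) = 2(x - 3).
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)   # move opposite the gradient
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)  # approaches 3
```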
6. What is the chain rule? (easy)
Answer: For nested functions, the derivative of the outer function (evaluated at the inner) is multiplied by the derivative of the inner; this powers backpropagation in neural networks.
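A concrete check of outer-times-inner, comparing the chain-rule answer against a numerical slope:

```python
import math

# h(x) = sin(x^2): outer = sin(u), inner = u = x^2.
# Chain rule: h'(x) = cos(x^2) * 2x  (outer' at the inner, times inner').
x = 1.3
analytic = math.cos(x ** 2) * 2 * x
numeric = (math.sin((x + 1e-5) ** 2) - math.sin((x - 1e-5) ** 2)) / 2e-5
```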
7. What are critical points? (medium)
Answer: Points where the derivative is zero or undefined; they can be minima, maxima, or saddle points.
8. What does the second derivative indicate? (medium)
Answer: Curvature. A positive second derivative indicates local convexity (a candidate minimum); a negative one indicates concavity (a candidate maximum).
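The curvature test can be run numerically too; a sketch with a hypothetical `second_derivative` helper:

```python
# f''(x) via the central second-difference formula.
def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

# f(x) = x^2 has a critical point at x = 0; f'' = 2 > 0 marks a minimum.
curv = second_derivative(lambda x: x ** 2, 0.0)
```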
9. Why are convex functions easier to optimize? (medium)
Answer: Any local minimum is a global minimum, so optimization is more stable and predictable.
10. What is the learning rate in optimization? (easy)
Answer: It is the step size of each parameter update; too high a rate diverges, too low a rate learns slowly.
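The too-high/too-low trade-off is easy to demonstrate on f(x) = x^2, where each update multiplies x by (1 - 2*lr):

```python
# Gradient steps on f(x) = x^2 (gradient 2x) from x0 = 1.
def run(lr, steps=20):
    x = 1.0
    for _ in range(steps):
        x -= lr * 2 * x
    return x

good = run(lr=0.1)   # |1 - 0.2| < 1 : x shrinks toward the minimum
bad = run(lr=1.1)    # |1 - 2.2| > 1 : x oscillates and blows up
```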
11. What is the vanishing gradient problem? (hard)
Answer: Gradients become extremely small through deep layers, slowing or stopping learning in early layers.
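A back-of-the-envelope illustration: backprop multiplies in one activation-derivative factor per layer, and the sigmoid's derivative never exceeds 0.25:

```python
import math

def sigmoid_prime(z):
    s = 1 / (1 + math.exp(-z))
    return s * (1 - s)        # maximum value 0.25, reached at z = 0

# Even in the best case (z = 0), a 20-layer chain shrinks the gradient
# by a factor of 0.25 per layer.
grad = 1.0
for _ in range(20):
    grad *= sigmoid_prime(0.0)
```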
12. What is the exploding gradient problem? (hard)
Answer: Gradients become very large, causing unstable updates and numerical overflow.
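The mirror image of the vanishing case: when the per-layer factor exceeds 1, the same product compounds upward (the factor 3.0 here is an arbitrary illustrative value):

```python
# Repeated multiplication by a per-layer factor > 1.
grad = 1.0
for _ in range(20):     # 20 layers, each scaling the gradient by 3
    grad *= 3.0
# grad is now 3**20, on the order of 3.5e9: large enough to destabilize updates
```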
13. How does regularization relate to calculus updates? (medium)
Answer: Regularization adds penalty terms to loss, changing derivatives so weights are constrained during optimization.
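A sketch of how an L2 penalty lam * w^2 shows up in the update (names and values are illustrative): the derivative gains a 2 * lam * w term, so weights decay even when the data gradient is zero:

```python
# One gradient step on loss = data_loss + lam * w^2.
def regularized_step(w, data_grad, lr=0.1, lam=0.01):
    return w - lr * (data_grad + 2 * lam * w)   # extra 2*lam*w from the penalty

w_next = regularized_step(w=5.0, data_grad=0.0)  # decays below 5 toward 0
```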
14. Why do we normalize features for gradient-based models? (medium)
Answer: Normalization improves the conditioning of the loss surface, helping gradient descent converge faster with a single learning rate.
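A small conditioning demo on a quadratic bowl 0.5 * (a1 * x1^2 + a2 * x2^2): with mismatched curvatures the stable learning rate is capped by the steep direction, so the shallow one crawls (the coefficients and rates are illustrative):

```python
# Count gradient-descent steps until both coordinates are below tol.
def steps_to_converge(a1, a2, lr, tol=1e-3, max_steps=10000):
    x1, x2 = 1.0, 1.0
    for step in range(1, max_steps + 1):
        x1 -= lr * a1 * x1   # gradient of 0.5*a1*x1^2 is a1*x1
        x2 -= lr * a2 * x2
        if abs(x1) < tol and abs(x2) < tol:
            return step
    return max_steps

ill  = steps_to_converge(1.0, 100.0, lr=0.019)  # lr capped by steep direction
well = steps_to_converge(1.0, 1.0, lr=0.5)      # equal scales: larger lr is safe
```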
15. One-line calculus summary for DS interviews? (easy)
Answer: Calculus gives the optimization mechanics that let ML models learn from data efficiently.