Neural Networks fundamentals 15 questions 25 min

Neural Networks MCQ · test your knowledge

From perceptron to backpropagation – 15 questions covering architecture, activation, optimisation & regularisation.

Easy: 5 · Medium: 6 · Hard: 4
Topics: Perceptron · MLP · Backprop · Activations

Neural networks: essential building blocks

Artificial neural networks are computing systems vaguely inspired by biological brains. They consist of interconnected units (neurons) that process information using connectionist approaches. This MCQ test covers the foundational elements every deep learning practitioner must know.

What is a neural network?

A neural network is a series of algorithms that endeavors to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. At its core, it comprises layers of neurons: input, hidden, and output.
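The layered structure can be sketched as a forward pass: each layer takes the previous layer's outputs, computes a weighted sum plus a bias for every neuron, and applies an activation. A minimal pure-Python sketch (the weights and biases below are illustrative placeholders, not trained values):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dense(x, weights, biases, activation):
    # One fully connected layer: weighted sum + bias, then activation.
    return [activation(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, biases)]

# 2 inputs -> 2 hidden neurons -> 1 output neuron
x = [0.5, -1.0]
hidden = dense(x, [[0.1, 0.4], [-0.3, 0.2]], [0.0, 0.1], sigmoid)
output = dense(hidden, [[0.7, -0.5]], [0.05], sigmoid)
```

Real implementations vectorise this with matrix multiplications, but the per-neuron arithmetic is exactly what the loop above does.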

Core concepts tested

Perceptron

The simplest form of a neural network, used for binary classification. It computes a weighted sum of its inputs and applies a step function. The perceptron convergence theorem guarantees that, if the data are linearly separable, the learning rule reaches a separating hyperplane in a finite number of updates.

Activation functions

Introduce non‑linearity: sigmoid squashes to (0, 1), tanh to (-1, 1), and ReLU computes max(0, x). Without them, any stack of layers would collapse to a single linear transform.
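As a quick illustration, here are the three activations in plain Python, plus a two-function demo of why stacking purely linear layers buys nothing:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))   # output in (0, 1)

def tanh(z):
    return math.tanh(z)                 # output in (-1, 1)

def relu(z):
    return max(0.0, z)                  # zero for negatives, identity otherwise

# Why non-linearity matters: composing two linear maps is still linear.
def f(x): return 2.0 * x + 1.0          # "layer 1", linear
def g(x): return -3.0 * x + 4.0         # "layer 2", linear
# g(f(x)) = -6x + 1 -- the two layers collapse into one linear transform.
```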

Backpropagation

The algorithm used to train neural networks via the chain rule. It computes the gradient of the loss with respect to each weight, then updates the weights by gradient descent.
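For a single sigmoid neuron with squared-error loss, the chain rule can be written out by hand. A minimal sketch (the variable names are my own), with each factor of dL/dw commented:

```python
import math

def forward(w, b, x):
    # One sigmoid neuron: y = sigmoid(w*x + b)
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def grad_w(w, b, x, t):
    # Chain rule for L = (y - t)^2:  dL/dw = dL/dy * dy/dz * dz/dw
    y = forward(w, b, x)
    dL_dy = 2.0 * (y - t)   # derivative of the squared error
    dy_dz = y * (1.0 - y)   # sigmoid'(z), expressed via the output y
    dz_dw = x               # z = w*x + b is linear in w
    return dL_dy * dy_dz * dz_dw
```

A finite-difference check is a handy way to validate hand-derived gradients like this one.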

Weight initialisation

Proper initialisation (e.g. Xavier for sigmoid/tanh layers, He for ReLU layers) keeps gradients from vanishing or exploding; a bad initialisation can stall training entirely.
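Both schemes fit in a few lines. The formulas here are the standard ones (Xavier uniform with limit sqrt(6/(fan_in+fan_out)), He normal with std sqrt(2/fan_in)), but treat the code as an illustrative sketch rather than a library implementation:

```python
import math
import random

def xavier_uniform(fan_in, fan_out):
    # Glorot/Xavier: keeps activation variance roughly constant across
    # layers; a common default for sigmoid/tanh networks.
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[random.uniform(-limit, limit) for _ in range(fan_in)]
            for _ in range(fan_out)]

def he_normal(fan_in, fan_out):
    # He: variance 2/fan_in compensates for ReLU zeroing half its inputs.
    std = math.sqrt(2.0 / fan_in)
    return [[random.gauss(0.0, std) for _ in range(fan_in)]
            for _ in range(fan_out)]
```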

Regularisation

Dropout, L1/L2, early stopping – techniques to reduce overfitting in overparameterised networks.
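Two of these techniques are easy to show directly. A minimal sketch of an L2 penalty term and inverted dropout (the function names are mine):

```python
import random

def l2_penalty(weights, lam):
    # Added to the loss: lam * sum(w^2) pushes weights towards zero.
    return lam * sum(w * w for w in weights)

def dropout(activations, p, training=True):
    # Inverted dropout: during training, zero each unit with probability p
    # and scale survivors by 1/(1-p) so the expected activation is unchanged.
    # At inference time the layer is a no-op.
    if not training or p == 0.0:
        return list(activations)
    return [0.0 if random.random() < p else a / (1.0 - p)
            for a in activations]
```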

Loss functions

MSE for regression, cross‑entropy for classification. The choice depends on the task and output layer activation.
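Both losses are short enough to write out. A sketch assuming the classifier outputs a full probability vector (e.g. from a softmax layer):

```python
import math

def mse(preds, targets):
    # Mean squared error, the usual choice for regression.
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

def cross_entropy(probs, target_index):
    # Negative log-likelihood of the true class; assumes probs sum to 1,
    # e.g. the output of a softmax layer.
    return -math.log(probs[target_index])
```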

# Simple perceptron update (targets in {-1, +1})
if prediction != target:
    # Misclassified: nudge the weights towards the correct class.
    for i in range(n_inputs):
        weights[i] += learning_rate * target * inputs[i]
    bias += learning_rate * target

NN interview prep: be ready to explain the vanishing gradient problem, dying ReLU, and why we need non‑linearities. This MCQ covers exactly those topics.

Common interview questions

  • What is the difference between a perceptron and a logistic regression model?
  • Why is ReLU non‑linear and why is it preferred over sigmoid?
  • How does backpropagation use the chain rule?
  • What is the role of the bias term in a neuron?
  • Explain the vanishing gradient problem and solutions.