Neural Networks: Interview Q&A
Short questions and answers on neurons, layers, activations, backpropagation, and other key deep learning concepts.
1
What is a neural network in simple terms?
⚡ Beginner
Answer: A neural network is a function built from many simple units called neurons, arranged in layers, that learns to map inputs to outputs from data.
2
What is a neuron (perceptron) mathematically?
📊 Intermediate
Answer: A neuron computes a weighted sum of inputs plus bias and passes it through a non-linear activation function.
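For illustration, a single neuron can be sketched in a few lines of Python (the `neuron` helper, the sample weights, and the choice of sigmoid are all illustrative, not part of any fixed convention):

```python
import math

def neuron(x, w, b):
    """One artificial neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # z = w . x + b
    return 1.0 / (1.0 + math.exp(-z))             # sigmoid(z)

output = neuron(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1)  # z = 0.1, output ~ 0.525
```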
3
Why do we need non-linear activation functions?
📊 Intermediate
Answer: Without non-linearities the network would be equivalent to a single linear transformation, unable to model complex patterns.
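To see why, note that stacking two purely linear layers is mathematically identical to one linear layer: W2(W1 x) = (W2 W1) x. A small sketch with toy 2x2 matrices (values chosen only for illustration):

```python
# Two stacked linear layers (no activation) collapse into a single linear map.
W1 = [[1.0, 2.0], [3.0, 4.0]]
W2 = [[0.5, -1.0], [2.0, 0.0]]
x = [1.0, -1.0]

def matvec(W, v):
    return [sum(w * vi for w, vi in zip(row, v)) for row in W]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

stacked = matvec(W2, matvec(W1, x))       # apply W1, then W2
collapsed = matvec(matmul(W2, W1), x)     # apply the single product W2 @ W1
# stacked == collapsed: depth adds nothing without a non-linearity between layers
```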
4
Name some common activation functions.
⚡ Beginner
Answer: Popular activations: ReLU, sigmoid, tanh, Leaky ReLU, and softmax (typically used in the output layer to produce probabilities).
5
What is backpropagation?
📊 Intermediate
Answer: Backpropagation is an algorithm that uses the chain rule to efficiently compute gradients of the loss with respect to all weights.
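A tiny worked example, assuming a single sigmoid neuron with squared-error loss (all numbers here are illustrative): the chain rule gives dL/dw = 2(a - y) · a(1 - a) · x, and a finite-difference check confirms the analytic gradient:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, y):
    """Squared error of a single sigmoid neuron."""
    return (sigmoid(w * x + b) - y) ** 2

w, b, x, y = 0.8, -0.2, 1.5, 1.0
a = sigmoid(w * x + b)

# Chain rule, applied factor by factor (this is what backprop automates):
# dL/da = 2(a - y); da/dz = a(1 - a); dz/dw = x
grad_w = 2 * (a - y) * a * (1 - a) * x

# Numerical check via central finite differences
eps = 1e-6
grad_num = (loss(w + eps, b, x, y) - loss(w - eps, b, x, y)) / (2 * eps)
```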
6
How are network weights updated during training?
⚡ Beginner
Answer: Weights are updated using gradient descent or its variants, moving parameters in the direction that reduces loss.
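The update rule is simply "step against the gradient". A minimal sketch on a toy one-dimensional loss f(w) = (w - 3)^2 (the function and learning rate are illustrative):

```python
# Gradient descent on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
def grad(w):
    return 2 * (w - 3.0)

w = 0.0    # initial parameter value
lr = 0.1   # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)  # move against the gradient to reduce the loss
# w converges toward the minimum at w = 3
```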
7
What is the vanishing gradient problem?
🔥 Advanced
Answer: In deep networks, gradients can become very small as they are backpropagated, making earlier layers learn very slowly.
8
How does ReLU help with vanishing gradients?
📊 Intermediate
Answer: ReLU has a constant gradient of 1 for positive inputs, which reduces gradient shrinkage compared to sigmoid/tanh.
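To make the contrast concrete: the backpropagated gradient is (roughly) a product of per-layer derivatives, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth. A sketch under that simplifying assumption:

```python
import math

def sigmoid_grad(z):
    s = 1.0 / (1.0 + math.exp(-z))
    return s * (1 - s)  # at most 0.25 (attained at z = 0)

def relu_grad(z):
    return 1.0 if z > 0 else 0.0  # exactly 1 for any positive input

depth = 20
sig_chain = 1.0
relu_chain = 1.0
for _ in range(depth):
    sig_chain *= sigmoid_grad(0.0)  # 0.25 per layer, even at the best case
    relu_chain *= relu_grad(1.0)    # 1.0 per layer for positive pre-activations
# sig_chain = 0.25**20 ~ 9e-13, while relu_chain stays at 1.0
```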
9
What is overfitting in neural networks and how do you reduce it?
📊 Intermediate
Answer: Overfitting means the network memorizes noise in the training data instead of general patterns. You reduce it with more data, regularization, dropout, early stopping, and simpler architectures.
10
What is dropout and why is it used?
🔥 Advanced
Answer: Dropout randomly zeros out a fraction of activations during training, acting like an ensemble of subnetworks and reducing co-adaptation and overfitting.
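One common formulation is inverted dropout, where surviving activations are rescaled by 1/(1-p) during training so no rescaling is needed at inference. A sketch (the helper name and inputs are illustrative):

```python
import random

def dropout(activations, p, training=True):
    """Inverted dropout: zero each unit with probability p and rescale the
    survivors by 1/(1-p) so the expected activation is unchanged;
    do nothing at inference time."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0 for a in activations]

random.seed(0)
out = dropout([1.0] * 1000, p=0.5)  # roughly half zeroed, the rest scaled to 2.0
```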
11
What is the difference between batch, mini-batch and stochastic gradient descent?
📊 Intermediate
Answer: Batch gradient descent uses the entire dataset per update, stochastic uses a single sample, and mini-batch uses small subsets (the most common choice in practice).
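The three variants differ only in how the data is sliced per update. A sketch (the `minibatches` helper is illustrative):

```python
def minibatches(data, batch_size):
    """Split a dataset into consecutive mini-batches (last may be smaller)."""
    return [data[i:i + batch_size] for i in range(0, len(data), batch_size)]

samples = list(range(10))
full_batch = minibatches(samples, batch_size=10)  # batch GD: 1 update per epoch
minis = minibatches(samples, batch_size=4)        # mini-batch GD: 3 updates
sgd = minibatches(samples, batch_size=1)          # stochastic GD: 10 updates
```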
12
Name some popular optimizers for neural networks.
⚡ Beginner
Answer: Common optimizers: SGD with momentum, Adam, RMSProp, Adagrad.
13
What is a loss function in neural networks?
⚡ Beginner
Answer: The loss function measures how wrong predictions are; training aims to minimize this quantity.
14
Which loss functions are common for regression and classification?
⚡ Beginner
Answer: Regression: MSE, MAE. Classification: cross-entropy (binary or categorical).
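Both kinds of loss are short formulas. A sketch of MSE and binary cross-entropy (the helper names and sample values are illustrative; the `eps` clipping keeps the logs finite):

```python
import math

def mse(y_true, y_pred):
    """Mean squared error, typical for regression."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy, typical for two-class classification."""
    return -sum(t * math.log(max(p, eps)) + (1 - t) * math.log(max(1 - p, eps))
                for t, p in zip(y_true, y_pred)) / len(y_true)

reg_loss = mse([3.0, 5.0], [2.5, 5.5])               # 0.25
clf_loss = binary_cross_entropy([1, 0], [0.9, 0.2])  # ~0.164
```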
15
What is weight initialization and why does it matter?
🔥 Advanced
Answer: Weight initialization chooses starting values for parameters; good schemes (e.g., Xavier/He) help avoid vanishing/exploding activations.
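He initialization, for example, draws weights from a Gaussian with variance 2/fan_in, scaled to the layer's input size so activations keep a stable scale through ReLU layers. A sketch (the `he_init` helper is illustrative; Xavier initialization uses a different variance formula):

```python
import math
import random

def he_init(fan_in, fan_out, rng):
    """He initialization: weights ~ N(0, 2 / fan_in), suited to ReLU layers."""
    std = math.sqrt(2.0 / fan_in)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)] for _ in range(fan_in)]

rng = random.Random(42)
W = he_init(fan_in=512, fan_out=256, rng=rng)
flat = [w for row in W for w in row]
var = sum(w * w for w in flat) / len(flat)  # empirical variance, ~ 2/512
```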
16
What is batch normalization?
🔥 Advanced
Answer: Batch norm normalizes layer inputs within a batch and learns scale/shift, which can stabilize training and allow higher learning rates.
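For one feature, the normalization step is just "subtract the batch mean, divide by the batch standard deviation, then scale and shift". A sketch (the `batch_norm` helper is illustrative and omits the running statistics used at inference):

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize one feature across the batch to zero mean / unit variance,
    then apply the learned scale (gamma) and shift (beta)."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

normed = batch_norm([1.0, 2.0, 3.0, 4.0])  # mean ~ 0, variance ~ 1
```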
17
What is the difference between a shallow and a deep network?
⚡ Beginner
Answer: Shallow networks have one or few hidden layers; deep networks have many hidden layers that learn hierarchical features.
18
What are CNNs and RNNs in brief?
📊 Intermediate
Answer: CNNs specialize in spatial data like images using convolutions; RNNs/sequence models handle sequential data like text or time series.
19
When would you avoid using a neural network?
📊 Intermediate
Answer: When you have very little data, strong interpretability needs, or simple tabular problems where tree-based models often perform better.
20
What is the key message to remember about neural networks?
⚡ Beginner
Answer: Neural networks are flexible function approximators; success depends on good data, architecture, regularization and training practices.
Quick Recap: Neural Networks
If you can explain neurons, activations, backprop and overfitting controls, you can handle most neural-network interview questions with confidence.