Loss Functions & Metrics
How Neural Networks Measure "Wrongness"
What You'll Discover
Learn how neural networks know when they're wrong
MSE for Numbers
How to measure prediction errors when predicting continuous values like prices or temperatures.
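As a quick sketch (the house prices here are made-up numbers), MSE is just the average of the squared differences between predictions and targets:

```python
import numpy as np

# Hypothetical house prices (in $1000s): actual sale prices vs. predictions.
y_true = np.array([250.0, 300.0, 190.0])
y_pred = np.array([260.0, 280.0, 195.0])

# MSE: average of squared differences. Squaring means the 20-unit miss
# contributes 400 to the sum, while the 10- and 5-unit misses contribute
# only 100 and 25.
mse = np.mean((y_true - y_pred) ** 2)
print(mse)  # (100 + 400 + 25) / 3 = 175.0
```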
Cross-Entropy for Categories
The go-to loss for classification - why it punishes confident wrong answers harshly.
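A minimal illustration of that punishment, using the binary cross-entropy formula directly (the spam example is hypothetical): a confident wrong answer costs far more than an unsure one.

```python
import math

def bce(y_true, p):
    # Binary cross-entropy for one example: -[y*log(p) + (1-y)*log(1-p)]
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

# True label is 1 ("spam"). Compare a cautious guess to a confident wrong one.
print(bce(1, 0.9))   # confident and right -> ~0.105
print(bce(1, 0.5))   # unsure              -> ~0.693
print(bce(1, 0.01))  # confident and wrong -> ~4.605
```

The loss grows without bound as the predicted probability of the true class approaches zero, which is exactly why confident mistakes are penalized so harshly.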
Interactive Comparison
See side-by-side how different loss functions respond to the same errors.
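The same comparison can be done in a few lines of code. Given identical residuals with one outlier (the numbers below are arbitrary), squaring lets the outlier dominate MSE, while MAE stays modest:

```python
import numpy as np

errors = np.array([1.0, 2.0, 3.0, 20.0])  # same errors, one outlier

mse = np.mean(errors ** 2)     # squaring makes the outlier dominate
mae = np.mean(np.abs(errors))  # linear penalty weighs all errors equally

print(mse)  # (1 + 4 + 9 + 400) / 4 = 103.5
print(mae)  # (1 + 2 + 3 + 20) / 4  = 6.5
```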
When to Use What
A simple decision guide for picking the right loss function for your task.
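That decision guide can be sketched as a small helper function (a hypothetical illustration, not a library API; the task-type strings are assumptions):

```python
def pick_loss(task_type, num_classes=None):
    # Hypothetical helper mirroring the decision guide:
    # continuous targets -> MSE; two classes -> BCE; many classes -> CCE.
    if task_type == "regression":
        return "mse"  # consider "mae" if outliers are a concern
    if task_type == "classification":
        if num_classes == 2:
            return "binary_cross_entropy"
        return "categorical_cross_entropy"
    raise ValueError(f"unknown task type: {task_type}")

print(pick_loss("regression"))                      # mse
print(pick_loss("classification", num_classes=2))   # binary_cross_entropy
print(pick_loss("classification", num_classes=10))  # categorical_cross_entropy
```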
Key Concepts
MSE (Mean Squared Error)
Squares errors - big mistakes get punished more
MAE (Mean Absolute Error)
Linear penalty - every unit of error counts equally
Binary Cross-Entropy
For yes/no classification with probabilities
Categorical Cross-Entropy
For multi-class: pick one from many options
Loss vs Metrics
Training uses loss, evaluation uses metrics
Matching Activations
Sigmoid→BCE, Softmax→CCE, Linear→MSE
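The Softmax→CCE pairing from the list above can be sketched in a few lines of NumPy (the logits are made-up numbers): softmax turns raw outputs into probabilities, and categorical cross-entropy with a one-hot target reduces to minus the log of the true class's probability.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def categorical_cross_entropy(probs, true_index):
    # With a one-hot target, CCE is just -log(probability of the true class).
    return -np.log(probs[true_index])

logits = np.array([2.0, 0.5, -1.0])  # hypothetical raw network outputs
probs = softmax(logits)              # valid probabilities that sum to 1
loss = categorical_cross_entropy(probs, true_index=0)
print(probs.round(3), loss)
```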
Continue Learning
See how networks minimize loss
What is a Loss Function?
A loss function is like a scorecard that tells your neural network how wrong its predictions are.
Think of it like playing darts:
- Hit the bullseye → Loss = 0 (perfect!)
- Miss by a little → Small loss
- Miss by a lot → Big loss
The network's job is to make the loss as small as possible.
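The darts analogy above, scored with squared error (the bullseye value 50 is arbitrary):

```python
def squared_error(target, prediction):
    # A simple loss: zero for a perfect hit, growing fast with the miss.
    return (target - prediction) ** 2

target = 50  # the bullseye
print(squared_error(target, 50))  # hit the bullseye -> 0
print(squared_error(target, 48))  # miss by a little -> 4
print(squared_error(target, 30))  # miss by a lot    -> 400
```

Training is the process of nudging the network's predictions so that this number shrinks toward zero.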