Machine Learning

Elastic Weight Consolidation

A technique for continual learning that identifies which weights are important for previously learned tasks and penalizes changes to those weights during new learning.

Why It Matters

EWC is one of the best-known approaches to mitigating catastrophic forgetting, enabling models to learn new tasks without losing previously acquired abilities.

Example

After learning task A, estimate each weight's importance (in EWC, via the diagonal of the Fisher information), then train on task B with a quadratic penalty that keeps the most important weights close to their task-A values.
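The penalty can be sketched in a few lines. This is a minimal illustration with NumPy, not a full training loop: `theta_star` is the weight vector after task A, `fisher` is an assumed per-weight importance estimate, and `lam` weights the penalty against the task-B loss.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    # EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.
    # fisher approximates each weight's importance for task A
    # (the diagonal of the Fisher information at theta_star).
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

def total_loss(task_b_loss, theta, theta_star, fisher, lam=1.0):
    # Loss for task B plus the penalty anchoring task-A-critical weights.
    return task_b_loss + ewc_penalty(theta, theta_star, fisher, lam)
```

Moving a high-importance weight incurs a much larger penalty than moving a low-importance one by the same amount, so gradient descent steers changes toward weights that task A does not rely on.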

Think of it like...

Like protecting the load-bearing walls when renovating a house — you can change everything else, but the critical structural elements must remain intact.

Related Terms