Machine Learning

GRU

Gated Recurrent Unit — a streamlined variant of the LSTM that uses two gates (update and reset) instead of the LSTM's three, and merges the cell state and hidden state into one. With fewer parameters, it often matches LSTM performance on sequence tasks while training faster.

Why It Matters

GRUs offer a practical alternative to LSTMs with simpler architecture and faster training, making them a good default choice for sequence modeling when transformers are overkill.

Example

A GRU processing stock price sequences to predict next-day movements: the reset gate controls how much past state feeds into each candidate update, and the update gate controls how much of that candidate replaces the old state, so the network learns which historical information to keep.
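The gate mechanics above can be sketched as a single GRU step. This is a minimal illustration, not any library's implementation; the parameter names (`Wz`, `Uz`, etc.) and shapes are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step. params maps each gate name to (W, U, b):
    W projects the input x, U projects the previous hidden state,
    b is a bias. Names and layout are illustrative only."""
    Wz, Uz, bz = params["z"]
    Wr, Ur, br = params["r"]
    Wh, Uh, bh = params["h"]
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)             # update gate: how much to refresh
    r = sigmoid(Wr @ x + Ur @ h_prev + br)             # reset gate: how much history to use
    h_cand = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate new state
    return (1 - z) * h_prev + z * h_cand               # blend old state and candidate

# Tiny usage example with random weights (hidden size 4, input size 3).
rng = np.random.default_rng(0)
H, D = 4, 3
params = {g: (rng.standard_normal((H, D)) * 0.1,
              rng.standard_normal((H, H)) * 0.1,
              np.zeros(H)) for g in ("z", "r", "h")}
h = np.zeros(H)
for x in rng.standard_normal((5, D)):  # a 5-step input sequence
    h = gru_cell(x, h, params)
print(h.shape)  # (4,)
```

Note that sign conventions vary: some references write the final blend as z·h_prev + (1−z)·h_cand; only the interpretation of the update gate flips.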

Think of it like...

Like a streamlined filing system — instead of the LSTM's three separate controls for discarding, adding, and sharing information, it merges discarding and adding into a single "how much to refresh" decision.

Related Terms

LSTM, RNN, Transformer