Denoising
The process of removing noise from data to recover the underlying clean signal. In generative AI, denoising is the core mechanism of diffusion models.
Why It Matters
Denoising is central to both data cleaning and image generation. Diffusion models generate images by learning to turn pure random noise into a coherent output one denoising step at a time.
Example
A diffusion model starts with pure random noise and progressively removes noise over 50 steps until a clear, detailed image matching the text prompt emerges.
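The stepwise removal described above can be sketched with a toy loop. This is not a real diffusion model: a trained network would predict the noise to subtract at each step, so here that prediction is faked as the difference from a known "clean" signal, purely to illustrate the iterative structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the target image: a simple 1-D signal.
clean = np.sin(np.linspace(0, 2 * np.pi, 64))
x = rng.standard_normal(64)  # step 0: pure random noise

num_steps = 50
for t in range(num_steps):
    # A real model predicts the noise at each step; we fake that
    # prediction as the gap between the current sample and the target.
    predicted_noise = x - clean
    x = x - predicted_noise / (num_steps - t)  # remove a fraction per step

error = np.abs(x - clean).max()  # how far the final sample is from clean
```

After 50 steps the sample has converged onto the clean signal, mirroring how a diffusion sampler ends on a coherent image.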
Think of it like...
Like restoring an old photograph — carefully removing scratches, stains, and fading to reveal the original image underneath.
Related Terms
Diffusion Model
A type of generative AI model that creates data by starting with random noise and gradually removing it, step by step, until a coherent output (like an image) emerges. This process is called denoising.
Noise
Random variation or errors in data that do not represent true underlying patterns. In deep learning, noise can also refer to the random input used in generative models.
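A minimal sketch of this definition: corrupt a clean signal with zero-mean Gaussian noise, the kind of random variation that hides the true underlying pattern. The signal and noise level here are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

clean = np.linspace(0, 1, 100)        # true underlying pattern
noise = rng.normal(0, 0.1, size=100)  # zero-mean random variation
noisy = clean + noise                 # what a sensor might actually record
```

Denoising is the inverse problem: recovering something close to `clean` given only `noisy`.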
Data Preprocessing
The process of cleaning, transforming, and organizing raw data into a format suitable for machine learning. This includes handling missing values, encoding categories, scaling features, and removing outliers.
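Two of the steps named above, handling missing values and scaling features, can be sketched in a few lines of NumPy (the data and column layout here are made up for illustration):

```python
import numpy as np

raw = np.array([[1.0, 200.0],
                [2.0, np.nan],   # a missing value
                [3.0, 400.0]])

# Handle missing values: replace NaN with the mean of the observed
# entries in the same column.
col_means = np.nanmean(raw, axis=0)
filled = np.where(np.isnan(raw), col_means, raw)

# Scale features: standardize each column to zero mean, unit variance.
scaled = (filled - filled.mean(axis=0)) / filled.std(axis=0)
```

Real pipelines typically also encode categorical columns and remove outliers, but the fill-then-scale pattern is the core idea.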
Autoencoder
A neural network that learns to compress data into a lower-dimensional representation (encoding) and then reconstruct it back (decoding). It learns what features are most important for faithful reconstruction.