Neural Architecture Search
An automated technique for finding high-performing neural network architectures by searching through a vast space of possible designs. NAS automates architecture decisions that normally require expert intuition.
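The simplest search strategy is random search: sample candidate architectures from a defined space, score each one, and keep the best. The sketch below is a hypothetical toy, with a made-up search space and a stand-in `proxy_score` function where a real NAS system would train and validate each candidate (often using reinforcement learning, evolution, or gradient-based methods instead of random sampling).

```python
import random

# Toy NAS via random search. The search space and scoring
# function are illustrative assumptions, not a real system.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Hypothetical stand-in for validation accuracy: rewards
    capacity with a mild penalty for very large networks."""
    capacity = arch["num_layers"] * arch["width"]
    return capacity - 0.001 * capacity ** 1.5

def random_search(n_trials=50, seed=0):
    """Sample n_trials candidates and return the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch

print(random_search())
```

Real NAS replaces `proxy_score` with an expensive train-and-evaluate step, which is why so much research focuses on making the search itself cheaper (weight sharing, performance prediction, one-shot models).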
Why It Matters
NAS has discovered architectures that outperform human-designed ones. It represents the automation of AI design itself — AI designing AI.
Example
NASNet, discovered by a NAS algorithm, outperformed hand-designed architectures on ImageNet by automatically finding an effective combination of layers and connections.
Think of it like...
Like having an architect who can instantly test millions of building designs and find the one that best balances strength, cost, and aesthetics.
Related Terms
AutoML
Automated Machine Learning — tools and techniques that automate the end-to-end process of applying machine learning, including feature engineering, model selection, and hyperparameter tuning.
Hyperparameter Tuning
The process of systematically searching for the best combination of hyperparameters for a model. Since hyperparameters are set before training, finding optimal values requires experimentation.
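That systematic search is often a simple grid or random search over candidate values. The sketch below uses a made-up validation loss surface purely for illustration; in practice each point in the grid would require training and evaluating a model.

```python
import itertools

def validation_loss(lr, batch_size):
    """Hypothetical loss surface, minimized at lr=0.01,
    batch_size=32 (a stand-in for real train/validate runs)."""
    return (lr - 0.01) ** 2 * 1e4 + (batch_size - 32) ** 2 / 1e3

# Candidate values for each hyperparameter.
grid = {
    "lr": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

# Grid search: evaluate every combination, keep the best.
best = min(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=lambda p: validation_loss(p["lr"], p["batch_size"]),
)
print(best)  # → {'lr': 0.01, 'batch_size': 32}
```

Grid search scales poorly as hyperparameters multiply, which is why random search and Bayesian optimization are common alternatives.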
Meta-Learning
An approach where models 'learn to learn': they are trained across many tasks so they can quickly adapt to new tasks with minimal data.
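One concrete way to "learn to learn" is to meta-learn an initialization that adapts quickly. The toy sketch below uses a Reptile-style update on a hypothetical one-parameter problem (each "task" is fitting a scalar to a different target); the task distribution and all constants are illustrative assumptions.

```python
import random

def inner_adapt(w, target, steps=5, lr=0.1):
    """Adapt to one task: a few gradient steps on (w - target)^2."""
    for _ in range(steps):
        w -= lr * 2 * (w - target)  # gradient of (w - target)^2
    return w

def reptile(meta_steps=1000, meta_lr=0.1, seed=0):
    """Reptile-style meta-training: nudge the shared initialization
    toward the weights found after adapting to each sampled task."""
    rng = random.Random(seed)
    w_init = 0.0
    for _ in range(meta_steps):
        target = rng.uniform(2.0, 4.0)            # sample a task
        w_adapted = inner_adapt(w_init, target)
        w_init += meta_lr * (w_adapted - w_init)  # Reptile update
    return w_init

w = reptile()
print(w)  # settles near 3.0, the mean task target
```

The meta-learned initialization sits close to every task's optimum, so a handful of inner gradient steps suffices to adapt to a new task, which is the core promise of meta-learning.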