Homomorphic Encryption
A form of encryption that allows computation on encrypted data without decrypting it first. The results, when decrypted, match what would have been computed on the plaintext.
Why It Matters
Homomorphic encryption is the holy grail of privacy-preserving computation — AI can process your data without ever seeing it in plain form.
Example
A hospital sends encrypted patient data to a cloud AI service, which runs predictions without ever decrypting it. Results come back encrypted, and only the hospital holds the key to read them.
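The property behind this scenario can be sketched with textbook RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. This is a minimal sketch with toy key sizes, not a secure or production scheme; real deployments use fully homomorphic schemes such as BFV or CKKS.

```python
# Toy demo of a homomorphic property: textbook RSA satisfies
#   Enc(a) * Enc(b) mod n == Enc(a * b)
# Tiny illustrative primes -- never use parameters like these in practice.

p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
c_prod = (encrypt(a) * encrypt(b)) % n   # computed on ciphertexts only
assert decrypt(c_prod) == a * b          # decrypts to 42
```

Textbook RSA supports only multiplication; a *fully* homomorphic scheme, which the cloud-prediction scenario requires, supports both addition and multiplication on ciphertexts.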
Think of it like...
Like a lockbox where you can manipulate objects inside through gloves attached to the box — you work with the contents without ever opening the box.
Related Terms
Privacy-Preserving ML
Machine learning techniques that train models or make predictions while protecting the privacy of individual data points. Includes federated learning, differential privacy, and homomorphic encryption.
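Of the techniques listed, differential privacy is the easiest to sketch: add calibrated noise to a query's answer so that no single record's presence can be inferred. Below is a hedged illustration of the Laplace mechanism for a counting query; the function name and dataset are hypothetical.

```python
import math
import random

def private_count(records, predicate, epsilon: float) -> float:
    """Count matching records with Laplace noise (illustrative sketch).
    A counting query changes by at most 1 when one record is added or
    removed (sensitivity 1), so the noise scale is 1 / epsilon."""
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) by inverting its CDF on a uniform draw.
    u = random.random() - 0.5
    noise = -math.copysign(1.0, u) * (1 / epsilon) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Hypothetical data: ages of patients; count those 40 or older.
ages = [23, 35, 45, 52, 61]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the noisy answer is safe to release because it masks any individual's contribution.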
Data Privacy
The right of individuals to control how their personal information is collected, used, stored, and shared. In AI, data privacy concerns arise from training data, user interactions, and model outputs.
Federated Learning
A decentralized training approach where a model is trained across multiple devices or organizations without sharing raw data. Each participant trains locally and only shares model updates.
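The averaging step can be sketched in a few lines. This is a simplified illustration of federated averaging (FedAvg) where each "model" is a single least-squares weight; the client names and data are hypothetical.

```python
# Sketch of federated averaging: each client fits a model on its own
# data and shares only the fitted parameter; the server averages the
# parameters and never sees the raw data.

def local_fit(xs, ys):
    # Closed-form least-squares slope for y ~ w * x (no intercept).
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def federated_average(client_datasets):
    # Server receives one weight per client -- never the (xs, ys) pairs.
    local_weights = [local_fit(xs, ys) for xs, ys in client_datasets]
    return sum(local_weights) / len(local_weights)

clients = [
    ([1.0, 2.0], [2.0, 4.0]),    # hypothetical hospital A: slope 2
    ([1.0, 3.0], [4.0, 12.0]),   # hypothetical hospital B: slope 4
]
global_w = federated_average(clients)   # -> 3.0
```

Real systems weight each client's update by its dataset size and repeat this over many rounds, but the privacy idea is the same: parameters travel, raw data stays put.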