What is Model Distillation?
AI Engineering
Training a smaller model to replicate the behavior of a larger teacher model for efficiency.
Knowledge distillation transfers capabilities from a large teacher model to a smaller, faster student. Instead of training only on hard labels, the student learns to match the teacher's full output distribution, which carries richer information about how classes relate. This enables deploying powerful AI on edge devices and reducing inference costs while largely preserving quality.
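A common formulation (following Hinton et al.) softens both teacher and student outputs with a temperature T and minimizes the KL divergence between them. The sketch below illustrates that loss in plain Python; the function names and example logits are illustrative, not from any particular library.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    Scaled by T^2 so gradient magnitudes stay comparable across
    temperatures, as in the standard soft-target formulation.
    """
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student whose logits match the teacher's incurs zero loss;
# a diverged student is penalized.
teacher = [3.0, 1.0, 0.2]
print(distillation_loss([3.0, 1.0, 0.2], teacher))  # 0.0
print(distillation_loss([0.2, 1.0, 3.0], teacher))  # positive
```

In practice this soft-target term is usually combined with an ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient.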