Distillation Can Make AI Models Smaller and Cheaper
A fundamental technique, known as distillation, lets researchers use a big, expensive model to train a smaller one at far lower cost.
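
For readers who want to see the idea in code, here is a minimal sketch of the standard distillation recipe: a small "student" network is trained to match the softened output distribution of a frozen "teacher" network, alongside the usual supervised loss. Everything here (the PyTorch framework, the toy models, the temperature, and the loss weighting) is an illustrative assumption, not a detail taken from the article.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical models: a large, expensive "teacher" and a small, cheap "student".
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 4.0  # temperature: softens the teacher's probabilities so they carry more signal

def distill_step(x, y):
    """One training step: match the teacher's soft labels plus the true labels."""
    with torch.no_grad():
        teacher_logits = teacher(x)      # teacher is frozen; only inference cost
    student_logits = student(x)

    # KL divergence between the softened student and teacher distributions,
    # scaled by T^2 as in the original distillation formulation.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, y)  # ordinary supervised loss
    loss = 0.9 * soft_loss + 0.1 * hard_loss        # weighting chosen for illustration

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: random tensors standing in for a real dataset.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
print(distill_step(x, y))
```

The key point the sketch illustrates is that the expensive teacher is only run in inference mode; all of the gradient updates, and thus most of the ongoing cost, belong to the much smaller student.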