Distillation Can Make AI Models Smaller and Cheaper

A fundamental technique lets researchers use a large, expensive model to train a smaller model at a fraction of the cost.
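The idea, usually called knowledge distillation, is to have the small "student" model imitate the output probabilities of the large "teacher" model rather than learning only from hard labels. Below is a minimal sketch in PyTorch; the temperature, loss weighting, and model names are illustrative assumptions, not details from the article.

```python
# Minimal knowledge-distillation loss sketch (assumed PyTorch setup).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend the usual hard-label loss with a soft-label term that pushes the
    student's output distribution toward the teacher's."""
    # Soft targets: the teacher's probabilities at a raised temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between student and teacher distributions, scaled by T^2.
    soft_loss = F.kl_div(soft_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Usage with hypothetical teacher/student models: the large teacher is frozen,
# and only the small student's parameters are updated.
#
# with torch.no_grad():
#     teacher_logits = teacher(batch_inputs)
# student_logits = student(batch_inputs)
# loss = distillation_loss(student_logits, teacher_logits, batch_labels)
# loss.backward()
# optimizer.step()
```

Because the teacher only runs inference to produce targets, the expensive model is paid for once, while the cheaper student is the one trained and deployed.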