Jack and Masters of all Trades: One-Pass Learning Sets of Model Sets From Large Pre-Trained Models
For deep learning, size is power. Massive neural networks trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These large pre-trained models, or "Jacks of All Trades" (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments …