Group Distributionally Robust Dataset Distillation with Risk
Minimization
Dataset distillation (DD) has emerged as a widely adopted technique for crafting a small synthetic dataset that captures the essential information of a larger training dataset, enabling the training of accurate neural models at a fraction of the cost. Its applications span various domains, including transfer learning, federated learning, and neural architecture search. The most popular methods for …