Dataset distillation is the task of synthesizing a small dataset such that models trained on it achieve high performance on the original large dataset. A dataset distillation algorithm takes a large real training set as input and outputs a small synthetic set, which is evaluated by training models on it and measuring their accuracy on real held-out data.
We provide a PyTorch implementation of Dataset Distillation. We distill the knowledge of tens of thousands of images into a few synthetic training images called distilled images.
Dataset distillation is the task of synthesizing a small dataset such that a model trained on the synthetic set will match the test accuracy of a model trained on the full dataset.
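Concretely, the task described in these snippets is usually posed as a bilevel problem: the synthetic images are learnable tensors, a network is trained on them for a few differentiable steps, and the resulting loss on real data is backpropagated through those steps back into the images. The sketch below (in PyTorch, since one of the implementations above is in PyTorch) illustrates this unrolled formulation; the network, data loader, shapes, and hyperparameters (make_model, real_loader, three inner steps, the learning rates) are illustrative assumptions rather than any particular paper's settings.

```python
import torch
import torch.nn.functional as F
from torch.func import functional_call

# Learnable synthetic data: 10 images for a 10-class toy problem (shapes are
# illustrative), plus a learnable inner-loop step size.
x_syn = torch.randn(10, 1, 28, 28, requires_grad=True)
y_syn = torch.arange(10)
inner_lr = torch.tensor(0.02, requires_grad=True)
outer_opt = torch.optim.Adam([x_syn, inner_lr], lr=1e-3)

# Stand-in for the real training data (replace with an actual DataLoader).
real_loader = [(torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))) for _ in range(100)]

def make_model():
    # Tiny stand-in network; real implementations typically use ConvNets.
    return torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))

for x_real, y_real in real_loader:
    model = make_model()
    params = dict(model.named_parameters())

    # Inner loop: a few SGD steps on the synthetic data, kept differentiable so
    # the outer gradient can flow back into x_syn and inner_lr.
    for _ in range(3):
        loss_syn = F.cross_entropy(functional_call(model, params, (x_syn,)), y_syn)
        grads = torch.autograd.grad(loss_syn, tuple(params.values()), create_graph=True)
        params = {k: p - inner_lr * g for (k, p), g in zip(params.items(), grads)}

    # Outer loss: performance of the inner-trained model on real data,
    # backpropagated through the unrolled updates into the synthetic images.
    loss_real = F.cross_entropy(functional_call(model, params, (x_real,)), y_real)
    outer_opt.zero_grad()
    loss_real.backward()
    outer_opt.step()
```

Keeping the inner updates differentiable (create_graph=True) is what lets the outer gradient reach the synthetic pixels; memory and compute grow with the number of unrolled steps, which is one of the main costs the methods listed below try to reduce.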
Dataset distillation compresses large datasets into smaller synthetic coresets that retain performance, with the aim of reducing the storage and computational cost of training.
Dataset Distillation concerns the synthesis of small synthetic datasets that still lead to models with good test performance. Generative Latent Distillation (GLaD) distills into the latent space of a deep generative prior rather than directly into pixel space.
We propose RDED, a novel, computationally efficient yet effective dataset distillation paradigm that enables both diversity and realism of the distilled data.
To achieve lossless dataset distillation, an intuitive idea is to increase the size of the synthetic dataset. However, previous dataset distillation methods tend to lose their advantage as the synthetic set grows.
The official implementation of the NeurIPS'22 paper "Remember the Past: Distilling Datasets into Addressable Memories for Neural Networks".
We propose a new dataset distillation algorithm using reparameterization and convexification of implicit gradients (RCIG) that substantially improves the state of the art.
We propose RaT-BPTT, a new algorithm for dataset distillation based on randomly truncated backpropagation through time, which sets the state of the art across various benchmarks.
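RaT-BPTT's name points at truncated backpropagation through time over the unrolled inner loop. As a hedged, generic sketch of that truncation idea (not a reproduction of the paper's algorithm), the inner loop from the earlier sketch can be unrolled for a randomly chosen number of steps while only the last few steps are kept differentiable, which bounds the memory of backpropagating through training; all names and hyperparameters below (make_model, real_loader, MAX_STEPS, WINDOW, INNER_LR) are assumptions.

```python
import random
import torch
import torch.nn.functional as F
from torch.func import functional_call

# Illustrative stand-ins (same assumptions as the earlier sketch).
x_syn = torch.randn(10, 1, 28, 28, requires_grad=True)
y_syn = torch.arange(10)
outer_opt = torch.optim.Adam([x_syn], lr=1e-3)
real_loader = [(torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))) for _ in range(100)]

def make_model():
    return torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))

MAX_STEPS, WINDOW, INNER_LR = 80, 10, 0.02   # assumed hyperparameters

for x_real, y_real in real_loader:
    model = make_model()
    params = {k: v.detach().requires_grad_(True) for k, v in model.named_parameters()}
    n_steps = random.randint(WINDOW, MAX_STEPS)        # randomly chosen unroll length

    for t in range(n_steps):
        keep_graph = t >= n_steps - WINDOW             # differentiate only the last WINDOW steps
        if not keep_graph:
            # Outside the window: updates are not differentiated through.
            params = {k: p.detach().requires_grad_(True) for k, p in params.items()}
        loss_syn = F.cross_entropy(functional_call(model, params, (x_syn,)), y_syn)
        grads = torch.autograd.grad(loss_syn, tuple(params.values()), create_graph=keep_graph)
        params = {k: p - INNER_LR * g for (k, p), g in zip(params.items(), grads)}

    # The outer gradient reaches x_syn only through the last WINDOW inner steps.
    loss_real = F.cross_entropy(functional_call(model, params, (x_real,)), y_real)
    outer_opt.zero_grad()
    loss_real.backward()
    outer_opt.step()
```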