Distilling Diffusion Models Into Conditional GANs
Recommendations
Diffusion models beat GANs on image synthesis
NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems
We show that diffusion models can achieve image sample quality superior to the current state-of-the-art generative models. We achieve this on unconditional image synthesis by finding a better architecture through a series of ablations. For conditional ...
Quantum Wasserstein GANs
NIPS'19: Proceedings of the 33rd International Conference on Neural Information Processing Systems
The study of quantum generative models is well motivated, not only because of its importance in quantum machine learning and quantum chemistry but also because of the perspective of its implementation on near-term quantum machines. Inspired by previous ...
Unifying GANs and score-based diffusion as generative particle models
NIPS '23: Proceedings of the 37th International Conference on Neural Information Processing Systems
Particle-based deep generative models, such as gradient flows and score-based diffusion models, have recently gained traction thanks to their striking performance. Their principle of displacing particle distributions using differential equations is ...
Published In
Publisher: Springer-Verlag, Berlin, Heidelberg