PaulKMandal/huggingface-dino
A Huggingface Compatible Distillation with No Labels Implementation

Running main-hf.py will train a Huggingface ViT model using DINO.
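
DINO trains a student network to match a momentum (EMA) teacher over different crops of the same image, using a centered, sharpened teacher distribution to avoid collapse. Below is a minimal NumPy sketch of that core objective; it is an illustration of the technique, not the code in main-hf.py, and the function names and temperatures are illustrative defaults.

```python
import numpy as np

def softmax(x, temp):
    # Temperature-scaled softmax along the last axis.
    z = x / temp
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dino_loss(student_logits, teacher_logits, center,
              student_temp=0.1, teacher_temp=0.04):
    # Teacher outputs are centered (to prevent collapse) and sharpened
    # with a low temperature; the student is trained by cross-entropy
    # against the teacher's distribution (no labels involved).
    t = softmax(teacher_logits - center, teacher_temp)
    log_s = np.log(softmax(student_logits, student_temp) + 1e-12)
    return float(-(t * log_s).sum(axis=-1).mean())

def ema_update(teacher_params, student_params, momentum=0.996):
    # Teacher weights track an exponential moving average of the student's;
    # the teacher never receives gradients directly.
    return [momentum * t + (1.0 - momentum) * s
            for t, s in zip(teacher_params, student_params)]

rng = np.random.default_rng(0)
s_logits = rng.normal(size=(8, 16))   # student outputs for a batch of crops
t_logits = rng.normal(size=(8, 16))   # teacher outputs for the same images
center = t_logits.mean(axis=0)        # running batch center in the real method
loss = dino_loss(s_logits, t_logits, center)
```

In the full method, the student sees both global and local crops while the teacher sees only global crops, and the center itself is updated as a running average of teacher outputs.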

If you find this useful, please consider starring this repository or citing this implementation:

@misc{mandalhfdino,
	author = {Paul K. Mandal},
	title = {A Huggingface Compatible Distillation with No Labels Implementation},
	year = {2023},
	month = {December},
	note = {\url{https://github.com/PaulKMandal/huggingface-dino/}},
}

You should also cite the original implementation:

@inproceedings{caron2021emerging,
  title={Emerging Properties in Self-Supervised Vision Transformers},
  author={Caron, Mathilde and Touvron, Hugo and Misra, Ishan and J\'egou, Herv\'e  and Mairal, Julien and Bojanowski, Piotr and Joulin, Armand},
  booktitle={Proceedings of the International Conference on Computer Vision (ICCV)},
  year={2021}
}
