This repository provides:
- Plug-and-Play Implementation of SEVEN
- Experiment: SEVENpre on ImageNet
- Experiment: SEVENpre on Cifar
- Experiment: SEVENdyn on GLUE
You can clone this repo and install it locally:

```
git clone https://github.com/xiaojinying/SEVEN.git
```

Requirements:

```
datasets>=2.18.0
easydict>=1.11
transformers>=4.35.2
```
To run the SST-2 example with SEVEN, run the following:

```
python train.py --dataset sst2 --alpha_1 0.8 --alpha_2 0.8 --learning_rate 2e-5 --epoch 10 --batchsize 32 --pruning_algo SEVEN --target_ratio 0.6
```
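To illustrate what `--target_ratio 0.6` means, the sketch below prunes a weight tensor to 60% sparsity by zeroing the smallest-magnitude entries. Note this is generic magnitude pruning for illustration only, not SEVEN's sentinel-based criterion; the function name and use of NumPy are our own assumptions.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, target_ratio: float) -> np.ndarray:
    """Zero the smallest-magnitude entries until `target_ratio` of them are pruned.

    Generic magnitude-pruning illustration of a sparsity target -- NOT SEVEN's
    sentinel-based scoring; it only shows what target_ratio=0.6 means.
    """
    flat = np.abs(weights).ravel()
    k = int(target_ratio * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep strictly larger magnitudes
    return weights * mask

# Example: prune a random weight matrix to 60% sparsity
w = np.random.RandomState(0).randn(32, 32)
pruned = magnitude_prune(w, target_ratio=0.6)
sparsity = float((pruned == 0).mean())
```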
If you find this work useful, please cite:

```bibtex
@article{xiao2024seven,
  title={SEVEN: Pruning Transformer Model by Reserving Sentinels},
  author={Xiao, Jinying and Li, Ping and Nie, Jie and Tang, Zhe},
  journal={arXiv preprint arXiv:2403.12688},
  year={2024}
}
```