
SEVEN: Pruning Transformer Model by Reserving Sentinels

IJCNN 2024 | [Paper] | [Code]

Our results

(Results figures.)

TODO List

  • Plug-and-Play Implementation of SEVEN
  • Experiment: SEVENpre on ImageNet
  • Experiment: SEVENpre on Cifar
  • Experiment: SEVENdyn on GLUE

Contents

  • Clone
  • Requirements
  • Experiments
  • Citation

Clone

You can clone this repo and run it locally.

git clone https://github.com/xiaojinying/SEVEN.git

Requirements

datasets>=2.18.0
easydict>=1.11
transformers>=4.35.2
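
The requirements above can be installed with pip in one step, e.g. (quoting keeps the shell from interpreting the `>=` specifiers as redirections):

```shell
pip install "datasets>=2.18.0" "easydict>=1.11" "transformers>=4.35.2"
```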

Experiments

GLUE

To run the SST-2 example with SEVEN, run the following:

python train.py --dataset sst2 --alpha_1 0.8 --alpha_2 0.8 --learning_rate 2e-5 --epoch 10 --batchsize 32 --pruning_algo SEVEN --target_ratio 0.6
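
SEVEN's own sentinel-based scoring is defined in the paper; purely as an illustration of what `--target_ratio 0.6` means, the sketch below applies generic magnitude pruning (not the SEVEN criterion) to a weight matrix until 60% of its entries are zeroed. The function name `magnitude_prune` is hypothetical and not part of this repo.

```python
import numpy as np

def magnitude_prune(weights, target_ratio):
    """Zero out the smallest-magnitude entries so that roughly
    `target_ratio` of the weights are removed.

    Generic magnitude pruning for illustration only -- SEVEN uses
    a different, sentinel-based importance score (see the paper).
    """
    flat = np.abs(weights).ravel()
    k = int(target_ratio * flat.size)  # number of entries to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned = magnitude_prune(w, 0.6)
sparsity = 1.0 - np.count_nonzero(pruned) / pruned.size
print(f"sparsity: {sparsity:.2f}")
```

With `target_ratio 0.6` as in the command above, about 60% of the entries end up zero; SEVEN differs in *which* weights it keeps (the "sentinels"), not in how the ratio is interpreted.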

Citation

@article{xiao2024seven,
  title={SEVEN: Pruning Transformer Model by Reserving Sentinels},
  author={Xiao, Jinying and Li, Ping and Nie, Jie and Tang, Zhe},
  journal={arXiv preprint arXiv:2403.12688},
  year={2024}
}
