Theano code for the Penn Treebank language model and Temporal Order experiments from the paper "Recurrent Dropout without Memory Loss".
Theano is required to run the experiments:
pip install Theano
First, run makedata.sh. Then select the model to run in config.py and run main.py.
To run the baseline models:
python -u main.py
To run the models with 0.25 per-step dropout in hidden states:
python -u main.py --hid_dropout_rate 0.25 --per_step
To run the models with 0.5 per-sequence dropout in hidden state updates:
python -u main.py --hid_dropout_rate 0.5 --drop_candidates
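The two flags above choose how dropout masks are sampled: per-step samples a fresh mask at every time step, while per-sequence samples one mask and reuses it for the whole sequence. A minimal NumPy sketch of the difference (illustrative only — the repository's actual implementation is in Theano, and dropout_masks is a hypothetical helper, not part of this code base):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_masks(rate, n_steps, hidden_size, per_step):
    """Bernoulli keep-masks scaled by 1/(1 - rate) (inverted dropout).

    per_step=True  -> a fresh mask is sampled at every time step
    per_step=False -> one mask is sampled once and reused across the sequence
    """
    keep = 1.0 - rate
    if per_step:
        masks = rng.binomial(1, keep, size=(n_steps, hidden_size))
    else:
        # Sample a single mask and tile it over all time steps.
        masks = np.repeat(rng.binomial(1, keep, size=(1, hidden_size)),
                          n_steps, axis=0)
    return masks / keep  # scale so the expected activation is unchanged

# Per-sequence mask at rate 0.5: every row (time step) is identical.
per_seq = dropout_masks(0.5, n_steps=30, hidden_size=8, per_step=False)
# Per-step masks at rate 0.25: rows generally differ.
per_step = dropout_masks(0.25, n_steps=30, hidden_size=8, per_step=True)
```

In the paper's setting these masks would be applied to the hidden-state updates (the candidate values) inside the recurrence, not to the inputs.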
To run the baseline LSTM without dropout:
python -u order.py
To run the models with 0.5 per-step dropout in hidden state updates on 30-symbol-long sequences:
python -u order.py --hid_dropout_rate 0.5 --drop_candidates --per_step --low 10 --high 20 --length 30