
ssemeniuta/drop-rnn


Recurrent Dropout without Memory Loss

Theano code for the Penn Treebank language model and Temporal Order experiments in the paper Recurrent Dropout without Memory Loss.
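The core idea of the paper is to apply dropout only to the cell's candidate update (the `g` gate in an LSTM), so the memory `c` itself is never multiplied by a dropout mask and long-term information is not erased. A minimal NumPy sketch of one such LSTM step (an illustration of the technique, not the repo's Theano implementation; names and shapes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_drop_candidates(x, h, c, W, U, b, drop_rate=0.0, mask=None):
    """One LSTM step with dropout applied only to the candidate update g,
    so the cell state c is never hit by a dropout mask directly."""
    n = h.shape[0]
    z = W @ x + U @ h + b              # stacked pre-activations: i, f, o, g
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    if drop_rate > 0.0:
        if mask is None:               # per-step: draw a fresh mask each call
            mask = rng.binomial(1, 1.0 - drop_rate, size=n) / (1.0 - drop_rate)
        g = g * mask                   # drop the update, not the memory
    c_new = f * c + i * g              # memory survives even when g is dropped
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

A per-sequence variant would sample `mask` once before the time loop and pass the same array to every step.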

Requirements

Theano is required for running the experiments:

pip install Theano

Language Modeling Experiments

First, run makedata.sh to prepare the data. Then select the model to run in config.py and launch main.py.

To run the baseline models:

python -u main.py

To run the models with 0.25 per-step dropout in hidden states:

python -u main.py --hid_dropout_rate 0.25 --per_step

To run the models with 0.5 per-sequence dropout in hidden state updates:

python -u main.py --hid_dropout_rate 0.5 --drop_candidates
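The two dropout variants above differ only in how often the dropout mask is resampled: per-step draws a new mask at every timestep, while per-sequence draws a single mask and reuses it for the whole sequence. A small sketch of the sampling difference (hypothetical helper, not code from this repo):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_masks(seq_len, hid_size, drop_rate, per_step):
    """Return one inverted-dropout mask per timestep.

    per_step=True  -> a fresh mask is drawn at every timestep
    per_step=False -> a single mask is drawn once and reused for the
                      whole sequence (per-sequence dropout)
    """
    keep = 1.0 - drop_rate
    if per_step:
        return rng.binomial(1, keep, size=(seq_len, hid_size)) / keep
    mask = rng.binomial(1, keep, size=hid_size) / keep
    return np.tile(mask, (seq_len, 1))
```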

Temporal Order Experiments

To run the baseline LSTM without dropout:

python -u order.py

To run the models with 0.5 per-step dropout in hidden state updates on 30-symbol-long sequences:

python -u order.py --hid_dropout_rate 0.5 --drop_candidates --per_step --low 10 --high 20 --length 30
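The Temporal Order task is in the spirit of Hochreiter and Schmidhuber's classic benchmark: a sequence of distractor symbols with two relevant symbols planted at certain positions, where the target class is the ordered pair of relevant symbols. The exact semantics of --low, --high, and --length are defined in order.py; the generator below is a hypothetical illustration assuming --length is the sequence length and --low/--high bound the position of the first relevant symbol:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical task generator; symbol sets and flag semantics are
# assumptions, not necessarily the repo's exact configuration.
DISTRACTORS = ['a', 'b', 'c', 'd']
RELEVANT = ['X', 'Y']

def make_sequence(length=30, low=10, high=20):
    """Build one temporal-order example: distractors everywhere except
    two relevant symbols; the label encodes their ordered pair."""
    seq = list(rng.choice(DISTRACTORS, size=length))
    p1 = rng.integers(low, high)       # first relevant symbol in [low, high)
    p2 = rng.integers(high, length)    # second one later in the sequence
    s1, s2 = rng.choice(RELEVANT), rng.choice(RELEVANT)
    seq[p1], seq[p2] = s1, s2
    label = RELEVANT.index(s1) * 2 + RELEVANT.index(s2)  # 4 classes: XX, XY, YX, YY
    return seq, label
```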
