This is a TensorFlow implementation of Ako (Ako: Decentralised Deep Learning with Partial Gradient Exchange). It lets you train any DNN in a decentralized manner without parameter servers: workers exchange partitioned gradients directly with each other and update their own local weights. Please refer to the original paper Ako or our project home for more details.
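The core idea of partial gradient exchange is that each worker splits its gradient into P partitions and, each round, sends only one partition to its peers, so the full gradient reaches every peer once per P rounds. Below is a minimal NumPy sketch of that schedule; the function names and the round-robin rule are illustrative only, not this repository's code.

import numpy as np

P = 4                       # number of gradient partitions (the "P value")

def partition_gradient(grad, p):
    # Split a flat gradient vector into p roughly equal partitions.
    return np.array_split(grad, p)

def partition_to_send(worker_id, round_idx, p):
    # Round-robin: which partition this worker pushes this round.
    return (worker_id + round_idx) % p

grad = np.random.randn(10)  # stand-in for a flattened layer gradient
parts = partition_gradient(grad, P)
for r in range(P):
    i = partition_to_send(0, r, P)
    print('round %d: worker 0 sends partition %d' % (r, i))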
Environments
- Ubuntu 16.04
- Python 2.7
- TensorFlow 1.4
Prerequisites
- redis-server & the redis Python client
- tflearn (only for loading the CIFAR-10 dataset)
$ sudo apt-get update
$ sudo apt-get install redis-server -y
$ sudo pip install redis
$ sudo pip install tflearn
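A quick way to verify both prerequisites from Python before training; the Redis key name here is arbitrary, and the default port 6379 is assumed (adjust if yours differs).

import redis
from tflearn.datasets import cifar10

# Check that the Redis server is reachable
r = redis.StrictRedis(host='localhost', port=6379, db=0)
r.set('ako:ping', 'pong')
print('Redis reachable: %s' % r.get('ako:ping'))

# tflearn is used only to fetch CIFAR-10
(X, Y), (X_test, Y_test) = cifar10.load_data()
print('CIFAR-10 loaded: %d train / %d test images' % (len(X), len(X_test)))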
- Build your model in redis_ako_model.py
- Write your session and load your dataset in redis_ako.py
- Change your configurations in redis_ako_config.py (a hypothetical sketch follows this list)
- Basic configurations: cluster IP/port, Redis port, synchronous training, training epochs, batch size, number of batches
- Ways to train models: for a fixed number of iterations, for a fixed wall-clock time, or until a target accuracy is reached
- Ako-specific configurations: P values, partition details, SSP iteration bound, number of queue threads
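The exact variable names live in redis_ako_config.py; the sketch below only illustrates the kinds of settings the list above describes, and every name in it is hypothetical.

# All names below are illustrative; check redis_ako_config.py for the real ones.
WORKERS = ['10.0.0.1:5000', '10.0.0.2:5000', '10.0.0.3:5000']  # cluster IP/port
REDIS_PORT = 6379
SYNC_TRAINING = True      # synchronous vs. asynchronous updates
TRAINING_EPOCHS = 50
BATCH_SIZE = 128
NUM_BATCHES = 390

STOP_MODE = 'accuracy'    # 'iterations' | 'time' | 'accuracy'
TARGET_ACCURACY = 0.85

P = 4                     # number of gradient partitions per worker
SSP_BOUND = 3             # max iteration gap tolerated between workers (SSP)
NUM_QUEUE_THREADS = 2     # threads draining incoming gradient partitions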
- Execute it
# When 3 workers are clustered and used for decentralized DL
# At worker 0
$ python redis_ako.py wk 0
# At worker 1
$ python redis_ako.py wk 1
# At worker 2
$ python redis_ako.py wk 2
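The same command runs on every machine; only the trailing index changes. A hypothetical sketch of how such arguments are typically read (the actual parsing in redis_ako.py may differ):

import sys

job_name = sys.argv[1]         # 'wk' marks a worker process
task_index = int(sys.argv[2])  # 0, 1, 2, ... identifies this worker in the cluster
print('starting %s %d' % (job_name, task_index))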