Multi-GPU back-propagation through time

 

We have implemented a multi-GPU version of the back-propagation through time (BPTT) algorithm for training MTRNNs (multiple time-scales recurrent neural networks). We have also tested the code on 4-GPU setups, and the preliminary results show excellent scaling: as the graph shows, two GPUs give an almost perfect 2x speedup over a single GPU, three GPUs give nearly a 3x speedup, and so on.
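To illustrate the kind of scheme that gives this near-linear scaling, here is a minimal sketch of data-parallel multi-GPU training: each GPU runs the BPTT backward pass over its own shard of the training sequences, and the per-GPU weight gradients are then summed on the host before the weight update. This is only an illustration of the general approach, not the actual Aquila code; the network sizes are arbitrary and the kernel body is a stand-in for the real MTRNN forward/backward pass.

```cpp
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

#define N_WEIGHTS 4096      // illustrative number of network weights (assumption)
#define SEQS_TOTAL 1024     // illustrative number of training sequences (assumption)

// Stand-in for the per-sequence BPTT backward pass: each thread accumulates
// a dummy gradient contribution for one weight over this GPU's sequences.
__global__ void bpttBackwardStub(const float* weights, float* grad, int seqsPerGpu)
{
    int w = blockIdx.x * blockDim.x + threadIdx.x;
    if (w >= N_WEIGHTS) return;
    float g = 0.0f;
    for (int s = 0; s < seqsPerGpu; ++s)
        g += 0.0001f * weights[w];   // placeholder for real error gradients
    grad[w] = g;
}

int main()
{
    int nGpus = 0;
    cudaGetDeviceCount(&nGpus);
    if (nGpus < 1) { printf("no CUDA device found\n"); return 1; }

    int seqsPerGpu = SEQS_TOTAL / nGpus;            // shard the training set
    std::vector<float> weights(N_WEIGHTS, 0.5f);    // shared weights (host copy)
    std::vector<float> gradSum(N_WEIGHTS, 0.0f);    // accumulated gradient
    std::vector<float*> dWeights(nGpus), dGrad(nGpus);

    // launch BPTT on every GPU over its own shard of sequences
    for (int g = 0; g < nGpus; ++g) {
        cudaSetDevice(g);
        cudaMalloc(&dWeights[g], N_WEIGHTS * sizeof(float));
        cudaMalloc(&dGrad[g],    N_WEIGHTS * sizeof(float));
        cudaMemcpy(dWeights[g], weights.data(), N_WEIGHTS * sizeof(float),
                   cudaMemcpyHostToDevice);
        bpttBackwardStub<<<(N_WEIGHTS + 255) / 256, 256>>>(dWeights[g], dGrad[g],
                                                           seqsPerGpu);
    }

    // collect and sum the per-GPU gradients on the host, then update the weights
    std::vector<float> gradLocal(N_WEIGHTS);
    for (int g = 0; g < nGpus; ++g) {
        cudaSetDevice(g);
        cudaDeviceSynchronize();
        cudaMemcpy(gradLocal.data(), dGrad[g], N_WEIGHTS * sizeof(float),
                   cudaMemcpyDeviceToHost);
        for (int w = 0; w < N_WEIGHTS; ++w) gradSum[w] += gradLocal[w];
        cudaFree(dWeights[g]); cudaFree(dGrad[g]);
    }
    const float lr = 0.01f;
    for (int w = 0; w < N_WEIGHTS; ++w) weights[w] -= lr * gradSum[w];

    printf("updated %d weights using %d GPU(s)\n", N_WEIGHTS, nGpus);
    return 0;
}
```

Because each GPU works on an independent shard and only the small gradient vectors are exchanged at the end of each epoch, the work per GPU shrinks roughly in proportion to the number of GPUs, which is consistent with the scaling we observe.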

We are now starting to develop scalable genetic algorithms for large genotypes that make good use of GPU-based fitness evaluation.
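As a rough illustration of GPU-based fitness evaluation, the sketch below copies the whole population to the GPU and lets one thread block evaluate one genotype, with the threads in a block cooperatively reducing over the genes. The population size and the sphere-function fitness are only placeholders (a real fitness would typically come from a simulation), not part of the actual implementation.

```cpp
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>
#include <vector>

#define POP_SIZE 256         // illustrative population size (assumption)
#define GENOTYPE_LEN 10000   // large genotype, as discussed in the text

// One block per individual: threads cooperatively reduce the fitness
// of a single genotype held in global memory.
__global__ void evaluateFitness(const float* genotypes, float* fitness)
{
    __shared__ float partial[256];
    const float* genome = genotypes + blockIdx.x * GENOTYPE_LEN;

    float sum = 0.0f;
    for (int i = threadIdx.x; i < GENOTYPE_LEN; i += blockDim.x)
        sum += genome[i] * genome[i];            // stand-in fitness: sphere function
    partial[threadIdx.x] = sum;
    __syncthreads();

    // tree reduction within the block
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (threadIdx.x < stride)
            partial[threadIdx.x] += partial[threadIdx.x + stride];
        __syncthreads();
    }
    if (threadIdx.x == 0)
        fitness[blockIdx.x] = -partial[0];       // higher is better
}

int main()
{
    std::vector<float> pop(POP_SIZE * GENOTYPE_LEN);
    for (auto& g : pop) g = (float)rand() / RAND_MAX - 0.5f;   // random genotypes

    float *dPop, *dFit;
    cudaMalloc(&dPop, pop.size() * sizeof(float));
    cudaMalloc(&dFit, POP_SIZE * sizeof(float));
    cudaMemcpy(dPop, pop.data(), pop.size() * sizeof(float), cudaMemcpyHostToDevice);

    evaluateFitness<<<POP_SIZE, 256>>>(dPop, dFit);

    std::vector<float> fit(POP_SIZE);
    cudaMemcpy(fit.data(), dFit, POP_SIZE * sizeof(float), cudaMemcpyDeviceToHost);
    printf("fitness of individual 0: %f\n", fit[0]);

    cudaFree(dPop); cudaFree(dFit);
    return 0;
}
```

The selection, crossover and mutation steps would then run on the host (or on the GPU as well), with only the genotypes and fitness values moving between host and device each generation.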

All these developments will be added to Aquila 2.0. New developers are welcome; contact me if you are interested.

 

[Figure: CPU vs GPU]