Releases: cerndb/dist-keras
Version 0.2.1
Several optimizations, including improved epoch handling. Note that this release introduces Issue #35.
Flat Minima
This release has seen a lot of updates compared to version 0.1.0. We removed the HTTP-based weight exchange and replaced it with a custom socket-based protocol, which drastically improves speed. Furthermore, a lot of work was done on the optimizer side, implementing several state-of-the-art distributed optimization algorithms.
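The distributed optimizers mentioned above typically follow an asynchronous parameter-server pattern: workers pull the central weights, compute a local gradient, and push it back without waiting for each other. The sketch below illustrates that pattern only; the class and method names are illustrative assumptions, not the actual dist-keras API.

```python
import numpy as np

# Hedged sketch of a DOWNPOUR-style asynchronous update. All names here
# (ParameterServer, push_gradient, pull_weights) are illustrative, not
# the actual dist-keras API.

class ParameterServer:
    """Holds the central weights; workers push gradients and pull weights."""
    def __init__(self, weights, learning_rate=0.1):
        self.weights = np.asarray(weights, dtype=float)
        self.learning_rate = learning_rate

    def push_gradient(self, gradient):
        # Apply a worker's (possibly stale) gradient to the central weights.
        self.weights -= self.learning_rate * np.asarray(gradient, dtype=float)

    def pull_weights(self):
        return self.weights.copy()

def worker_step(server, local_gradient_fn):
    # A worker pulls the current weights, computes a local gradient,
    # and pushes it back without synchronizing with other workers.
    w = server.pull_weights()
    g = local_gradient_fn(w)
    server.push_gradient(g)

# Toy example: minimize f(w) = ||w||^2, whose gradient is 2w.
server = ParameterServer(weights=[4.0, -2.0], learning_rate=0.25)
for _ in range(10):
    worker_step(server, lambda w: 2.0 * w)
print(server.pull_weights())
```

Because pushes are unsynchronized, a gradient may be computed against weights that have since changed; tolerating such staleness is the core trade-off these algorithms make for throughput.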
Parallel Beliefs
First release of Distributed Keras.
We have made significant performance improvements compared to the first prototypes. The most significant change is the parameter-communication protocol, which has been switched from a REST-based exchange to a low-level socket implementation. Furthermore, several changes have been made to provide users with training metrics, model-serving utilities, and additional examples, including the "Machine Learning Unit Test", MNIST.
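The switch from REST to raw sockets described above amounts to framing serialized weights directly on a TCP connection instead of wrapping them in HTTP requests. The sketch below shows one such framing (a 4-byte big-endian length prefix followed by a pickled payload); this wire format and the helper names are assumptions for illustration, not the actual dist-keras protocol.

```python
import pickle
import socket
import struct
import threading

# Hedged sketch of a low-level socket exchange for model weights. The framing
# (4-byte big-endian length prefix + pickled payload) is an assumption, not
# the actual dist-keras wire format.

def _recv_exact(sock, n):
    # Read exactly n bytes, looping because recv() may return partial data.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before full message arrived")
        buf += chunk
    return buf

def send_weights(sock, weights):
    payload = pickle.dumps(weights)
    # Length-prefix the payload so the receiver knows how many bytes to read.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_weights(sock):
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return pickle.loads(_recv_exact(sock, length))

# Demonstration over a loopback connection.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

received = {}
def serve():
    conn, _ = server.accept()
    received["weights"] = recv_weights(conn)
    conn.close()

t = threading.Thread(target=serve)
t.start()
client = socket.create_connection(("127.0.0.1", port))
send_weights(client, [[0.1, 0.2], [0.3]])
client.close()
t.join()
server.close()
print(received["weights"])
```

Skipping HTTP headers, routing, and request/response overhead on every exchange is what makes this kind of protocol substantially faster for frequent weight updates.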
Stale Gradient
Applied several production-critical fixes and methods.
Stale Gradient
Version 0.0.1: update workflow notebook.