
Adam Gradient Descent implementation

These short scripts implement an example of the Adam gradient descent optimization algorithm. The optimization is performed on the simple convex cost function f(x, y) = x^2 + y^2.
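The sketch below outlines the idea: Adam keeps exponentially decaying averages of the gradient and of the squared gradient, corrects them for bias, and uses them to scale each update step. The hyperparameter values and function names here are illustrative assumptions, not necessarily those used in adam_implementation.py.

    # Minimal sketch of Adam on f(x, y) = x^2 + y^2 (assumed hyperparameters)
    import numpy as np

    def f(point):
        # Convex cost function f(x, y) = x^2 + y^2
        x, y = point
        return x ** 2 + y ** 2

    def grad_f(point):
        # Analytical gradient: df/dx = 2x, df/dy = 2y
        x, y = point
        return np.array([2.0 * x, 2.0 * y])

    def adam(start, n_iter=200, alpha=0.02, beta1=0.9, beta2=0.999, eps=1e-8):
        point = np.asarray(start, dtype=float)
        m = np.zeros_like(point)  # first moment estimate
        v = np.zeros_like(point)  # second moment estimate
        for t in range(1, n_iter + 1):
            g = grad_f(point)
            m = beta1 * m + (1.0 - beta1) * g       # biased first moment
            v = beta2 * v + (1.0 - beta2) * g ** 2  # biased second moment
            m_hat = m / (1.0 - beta1 ** t)          # bias correction
            v_hat = v / (1.0 - beta2 ** t)
            point -= alpha * m_hat / (np.sqrt(v_hat) + eps)
        return point, f(point)

    if __name__ == "__main__":
        solution, value = adam(start=[0.8, -0.9])
        print(f"minimum near {solution}, f = {value:.6f}")

Because the function is convex with its minimum at the origin, the iterates should converge toward (0, 0) for any reasonable starting point.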

Usage

  1. python3 space_2d.py -> Plot the mathematical space in 2D
  2. python3 space_3d.py -> Plot the mathematical space in 3D (see the plotting sketch after this list)
  3. python3 adam_implementation.py -> Run the Adam gradient descent optimization over the function
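For the 3D view, the surface of f(x, y) = x^2 + y^2 can be drawn with matplotlib roughly as follows; the ranges, grid resolution, and styling are assumptions and may differ from the actual space_3d.py.

    # Minimal sketch of a 3D surface plot of f(x, y) = x^2 + y^2
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(-1.0, 1.0, 100)
    y = np.linspace(-1.0, 1.0, 100)
    X, Y = np.meshgrid(x, y)
    Z = X ** 2 + Y ** 2  # the convex cost surface

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot_surface(X, Y, Z, cmap="viridis")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("f(x, y)")
    plt.show()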


Credits

  1. Jason Brownlee, Code Adam Optimization Algorithm From Scratch, 2021. https://machinelearningmastery.com/adam-optimization-from-scratch/
  2. Diederik P. Kingma and Jimmy Ba, Adam: A Method for Stochastic Optimization, 2014. https://arxiv.org/abs/1412.6980
