Commit
oops
lucidrains committed Nov 21, 2024
1 parent 771341a commit 16bdbcd
Showing 2 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions adam_atan2_pytorch/adopt.py
@@ -98,15 +98,15 @@ def step(
     next_m = grad.div(v.sqrt().clamp(min = eps)) # they claim that a max(value, eps) performs better than adding the epsilon

     if steps > 1:
-        m.lerp_(next_m, 1. - beta2)
+        m.lerp_(next_m, 1. - beta1)

     # then update parameters

     p.add_(m, alpha = -lr)

     # update exp grad sq (v)

-    v.lerp_(grad_sq, 1. - beta1)
+    v.lerp_(grad_sq, 1. - beta2)

     # increment steps
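For reference, the corrected update can be sketched as a standalone function. This is a minimal sketch, not the repository's API: `adopt_step`, its hyperparameter defaults, and the in-place calling convention are illustrative assumptions; only the variable names and the four update lines mirror the diff above.

```python
import torch

def adopt_step(p, grad, m, v, steps, lr = 1e-4, beta1 = 0.9, beta2 = 0.99, eps = 1e-6):
    # One ADOPT-style update, mirroring the corrected lines in the diff.
    # p, m, v are mutated in place; steps is the 1-indexed step count.
    # Defaults here are illustrative assumptions, not the repo's values.

    grad_sq = grad * grad

    # normalize the gradient by the previous second moment estimate,
    # clamping v to eps (claimed to work better than adding eps)
    next_m = grad.div(v.sqrt().clamp(min = eps))

    if steps > 1:
        m.lerp_(next_m, 1. - beta1)  # first moment decays with beta1 (the fix)

    p.add_(m, alpha = -lr)           # parameter update

    v.lerp_(grad_sq, 1. - beta2)     # second moment decays with beta2 (the fix)
```

Note that `a.lerp_(b, w)` computes `a + w * (b - a)`, so with weight `1 - beta` it is exactly the exponential moving average `beta * a + (1 - beta) * b`, which is why swapping `beta1` and `beta2` silently mixed up the two moment decay rates.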
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "adam-atan2-pytorch"
-version = "0.1.2"
+version = "0.1.4"
 description = "Adam-atan2 for Pytorch"
 authors = [
     { name = "Phil Wang", email = "[email protected]" }
