
In Karpathy's PyTorch course, we found numerical instability on Google Colab, but no issues on a local device. #3199

Open
JonathanSum opened this issue Oct 31, 2022 · 3 comments

Comments

@JonathanSum

We hand-calculated the derivatives and compared them to the derivatives PyTorch computes.
The comparison showed no issues on the local device, but it showed mismatches on Colab.
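A small sketch of why this kind of comparison can pass on one machine and fail on another (the numbers below are illustrative, not from the notebook): floating-point addition is not associative, so the same gradient accumulated in a different order, for example by a different BLAS build on the Colab VM, need not match bit-for-bit.

```python
# Hypothetical illustration: two orderings of the same sum round differently,
# which is enough to break an exact (maxdiff == 0) gradient comparison.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)

print(a == b)      # False: the two accumulation orders round differently
print(abs(a - b))  # on the order of 1e-16: mathematically harmless, fatal to an exact check
```

The discrepancy is tiny and mathematically meaningless, but any check that demands exact equality will flag it, and the accumulation order can legitimately differ between a local machine and Colab's environment.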

karpathy/nn-zero-to-hero#13
I got no answer on the PyTorch forum. I am wondering whether the issue is Colab itself.

@cperry-goog

I'm guessing this is version skew between dependencies and will be very hard to debug. Is it possible to ensure your dependencies are all the same version as our base image? If so, you can share a minimal repro notebook.

@JonathanSum
Author

> I'm guessing this is version skew between dependencies and will be very hard to debug. Is it possible to ensure your dependencies are all the same version as our base image? If so, you can share a minimal repro notebook.

Colab notebook: https://colab.research.google.com/drive/1HmZ8bgtAfvyMaZyu3Sr1Bgxsj35jitTs?usp=sharing

@mco-gh

mco-gh commented Nov 10, 2022

How about setting a fixed number of digits of precision in your maxdiff calculation?
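One way to implement that suggestion (the gradient values below are made up for illustration, not taken from the notebook): round both gradients to a fixed number of digits before comparing, or compare with an explicit relative/absolute tolerance instead of requiring `maxdiff == 0`.

```python
import math

# Illustrative sketch: a manual gradient vs. an autograd gradient that
# differ only in sub-1e-7 noise from floating-point accumulation order.
manual   = [0.12345678, -0.87654321, 0.00000012]
autograd = [0.12345679, -0.87654322, 0.00000011]

# Option 1: fixed number of digits of precision, as suggested above.
DIGITS = 6
rounded_equal = all(round(m, DIGITS) == round(a, DIGITS)
                    for m, a in zip(manual, autograd))

# Option 2: tolerance-based comparison of the raw values.
maxdiff = max(abs(m - a) for m, a in zip(manual, autograd))
within_tol = all(math.isclose(m, a, rel_tol=1e-5, abs_tol=1e-7)
                 for m, a in zip(manual, autograd))

print(rounded_equal)  # True: rounding hides the sub-1e-6 noise
print(within_tol)     # True: the values agree within tolerance
print(maxdiff)        # small but nonzero, so an exact check would fail
```

In PyTorch itself, `torch.allclose(manual, autograd)` does the tolerance-based version of this comparison directly on tensors.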
