I tried to run the following cell in the attention tutorial notebook, and it throws the following error:

File ~/Desktop/code-exercises/alphafold/tutorials/attention/control_values/attention_checks.py:182, in test_module_method(module, test_name, input_names, output_names, control_folder, method, include_batched, overwrite_results)
    180     #print(out.shape, expected_out.shape, out, expected_out)
    181     print(torch.linalg.norm(out - expected_out))
--> 182     assert torch.allclose(out, expected_out), f'Problem with output {out_name} in test {test_name} in non-batched check.'
    184     if include_batched:
    185         for out, out_file_name, out_name in zip(batched_out, out_file_names, output_names):

AssertionError: Problem with output q_prep in test mha_prep_qkv in non-batched check.
When I print the norm of the difference between the two tensors compared in the torch.allclose call that raises the error, it is about 7e-6, so I believe this can be fixed by simply loosening the tolerance. Happy to make a PR doing that if you agree this is the issue!
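For concreteness, here is a minimal sketch of the kind of change I mean (the specific atol value is an assumption on my part, not a final choice; torch.allclose defaults to rtol=1e-05 and atol=1e-08):

    import torch

    # Minimal sketch: loosen the absolute tolerance so a platform-dependent
    # offset on the order of 1e-6 no longer trips the assertion.
    # The atol value below is an assumption, not necessarily what the PR would use.
    out = torch.randn(4, 8, dtype=torch.float32)
    expected_out = out + 1e-6 * torch.randn(4, 8)  # simulate a tiny numerical offset

    assert torch.allclose(out, expected_out, atol=1e-5), 'outputs differ beyond tolerance'

The same kind of tolerance could then be passed in the assert inside attention_checks.py.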
thanks for reporting the issue! I'm not very knowledgeable about cross-platform compatibility; do you think an error of this size is expected? It seems strange to me that double-precision computations would differ by that much, but I'm not sure how consistent PyTorch is across different platforms.
In any case, the error seems to be small enough not to lead to false positives, so I'd welcome a pull request!