
losses are always nan #12

Closed
m-ali-awan opened this issue Jan 1, 2022 · 7 comments
@m-ali-awan

Hi, hope you are fine.
Thanks for this wonderful work.
I tried training with MSL, and SMD, and my losses are always nan.
Moreover, I also tried GDN repo, and I found that there is a difference in MSL data as compared to this repo.
Thanks for any help.

Regards,
Ali

@m-ali-awan
Author

I dug further into the code and found that the outputs from TemporalAttentionLayer are always NaN, and some of the outputs from FeatureAttentionLayer come out as NaN as well.

@axeloh
Collaborator

axeloh commented Mar 7, 2022

If you haven't already, I suggest you ensure that

  1. the input data has the correct format, and
  2. the data itself does not contain NaNs and is scaled properly before being fed to the model.
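The two checks above can be sketched as a small preprocessing helper. This is a minimal sketch, not part of the repo: the function name and the assumption of a `(timesteps, features)` array are mine, and min-max scaling to [0, 1] is just one common choice.

```python
import numpy as np

def check_and_scale(train_data):
    """Sanity-check a (timesteps, features) array before training.

    Raises if NaNs/Infs are present, then min-max scales each feature
    column to [0, 1].
    """
    train_data = np.asarray(train_data, dtype=np.float64)
    if not np.isfinite(train_data).all():
        raise ValueError("input contains NaN or Inf values")
    col_min = train_data.min(axis=0)
    col_max = train_data.max(axis=0)
    # Guard against constant columns (max == min) to avoid division by
    # zero, which would itself inject NaNs downstream.
    span = np.where(col_max == col_min, 1.0, col_max - col_min)
    return (train_data - col_min) / span
```

Running the check before training makes a NaN in the data fail fast with a clear error instead of surfacing later as a NaN loss.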

Regards,
Axel

@ylic204

ylic204 commented Apr 10, 2022

I dug further into the code and found that the outputs from TemporalAttentionLayer are always NaN, and some of the outputs from FeatureAttentionLayer come out as NaN as well.

Have you solved this problem yet? I have the same issue: during training, the losses are all NaN, and there are many zeros in the data. I also don't understand why num_values in labeled_anomalies.csv differs from the shape of the corresponding .npy file in the train folder. For example, labeled_anomalies.csv lists 2264 for C-1, but C-1.npy has 2158 rows; the two don't match.

@JinYang88

Same problem with the SMD dataset, using the default hyperparameters.

@JinYang88

Setting use_gatv2 to False produces a normal (non-NaN) loss.
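For context on what that flag changes: GATv2 differs from the original GAT attention only in where the nonlinearity and the attention vector are applied. A minimal sketch of the two scoring functions (the variable names and this simplified single-pair form are mine, not the repo's actual layer code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

dim = 8
torch.manual_seed(0)
W = nn.Linear(2 * dim, dim, bias=False)  # linear map on the pair [h_i || h_j]
a = nn.Linear(dim, 1, bias=False)        # attention vector a

h_i, h_j = torch.randn(dim), torch.randn(dim)
pair = torch.cat([h_i, h_j]).unsqueeze(0)

# GAT-style score:   e_ij = LeakyReLU(a^T W [h_i || h_j])
e_gat = F.leaky_relu(a(W(pair)))

# GATv2-style score: e_ij = a^T LeakyReLU(W [h_i || h_j])
# (the nonlinearity moves inside, making the attention input-dependent)
e_gatv2 = a(F.leaky_relu(W(pair)))
```

So toggling the flag swaps one scoring function for the other; both should be numerically stable on clean inputs, which is why the later comments point at parameter initialization rather than the formulation itself.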

@ghost

ghost commented May 6, 2022

Check out my answer on #13, I believe it is due to uninitialized bias parameters.
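The bug class described here can be illustrated in isolation. This is a hedged sketch, not the repo's actual layer: a parameter created with torch.empty holds uninitialized memory, which can contain huge values, Infs, or NaNs that then propagate through the attention scores into the loss.

```python
import torch
import torch.nn as nn

class AttentionWithBias(nn.Module):
    """Toy layer showing the uninitialized-bias pitfall (names are mine)."""

    def __init__(self, dim, initialize=True):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        # Pitfall: torch.empty leaves the parameter uninitialized.
        self.bias = nn.Parameter(torch.empty(dim))
        if initialize:
            # Fix: initialize explicitly (zeros is a common safe choice).
            nn.init.zeros_(self.bias)

    def forward(self, x):
        return torch.softmax(self.lin(x) + self.bias, dim=-1)

layer = AttentionWithBias(dim=8)       # initialized -> finite outputs
out = layer(torch.randn(4, 8))
assert torch.isfinite(out).all()
```

With initialize=False the output is not guaranteed to be bad, which is exactly why this kind of bug is intermittent: it depends on whatever happened to be in memory.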

@JinYang88

JinYang88 commented May 6, 2022

Check out my answer on #13, I believe it is due to uninitialized bias parameters.

It works for me now, great!

@axeloh axeloh closed this as completed May 7, 2022
@axeloh axeloh pinned this issue May 7, 2022
JinYang88 pushed a commit to JinYang88/mtad-gat-pytorch that referenced this issue Dec 17, 2023
feat: possibility to specify target dim