
Weights not used in training batch #14

Open

hugokitano opened this issue Mar 4, 2020 · 3 comments

Comments

@hugokitano

Hi! Thanks for this project. I noticed the weights for each training batch are not used. How did you incorporate them in training? I read in your report that your mean AUC ended up being better with them.

@thtang
Owner

thtang commented Mar 4, 2020

What do you mean by weights for each training batch?

@hugokitano
Author

[screenshot of train.py showing the unused weights_sub definition]

In train.py, you define weights_sub, but it is never used in the loss function.

@thtang
Copy link
Owner

thtang commented Mar 5, 2020

Yup, the weights were originally used in the loss function, as described in the paper. However, they gave no improvement in AUC score and made the training code messy, so I dropped them. If you read the different versions of the paper on arXiv, you will find that they report exactly the same AUC numbers despite using different loss functions.
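
For readers who want to try the weighted variant, here is a minimal sketch of how per-batch weights could be plugged into a binary cross-entropy loss in PyTorch. This assumes a multi-label setup; `model`, `images`, `labels`, and the `weights_sub` tensor below are placeholders for illustration, not the repo's actual training code:

```python
import torch
import torch.nn.functional as F

def weighted_bce_step(model, images, labels, weights_sub, optimizer):
    """One training step using a weighted binary cross-entropy loss (sketch)."""
    optimizer.zero_grad()
    # Multi-label probabilities for the batch, shape (batch, num_classes).
    probs = torch.sigmoid(model(images))
    # `weight` rescales each element's loss term; `weights_sub` must
    # broadcast against `probs` (e.g. one weight per sample or per class).
    loss = F.binary_cross_entropy(probs, labels, weight=weights_sub)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Whether this helps will depend on the class balance in each batch; per the comment above, the unweighted loss reportedly reached the same AUC here.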
