
A problem when training on IEMOCAP dataset #3

Open
zbxytx opened this issue Feb 18, 2021 · 4 comments

@zbxytx

zbxytx commented Feb 18, 2021

While training DialogXL on the IEMOCAP dataset, the accuracy and F-score did not increase. I found that some other users faced the same problem, so could you check the files and code used in the IEMOCAP training stage?

(By the way, the model works correctly when training on the MELD dataset.)

@Digimonseeker
Collaborator

Sorry for this mistake, we have updated the IEMOCAP files.

@zbxytx
Author

zbxytx commented Mar 10, 2021

With the new files you provided, the problem still isn't fixed. Can you reproduce the results with the existing files?

@shenwzh3
Owner

Hi, you may check out our new version of the code; it is now compatible with Transformers v4.3.3. We have tested the code locally, and the hyper-parameters work well in this version.
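
For anyone hitting this, a minimal sketch for checking that the installed Transformers version matches the one the code was tested against before launching training (the v4.3.3 requirement comes from the comment above; the check itself is just an illustration, not part of the DialogXL code):

```python
# Minimal sanity check: confirm the environment matches the tested setup.
import transformers

EXPECTED = "4.3.3"  # version the maintainers say they tested against

if transformers.__version__ != EXPECTED:
    raise RuntimeError(
        f"Code was tested against transformers=={EXPECTED}, "
        f"but {transformers.__version__} is installed. "
        f"Try: pip install transformers=={EXPECTED}"
    )

print(f"transformers {transformers.__version__} OK")
```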

@so-hyeun

The same phenomenon occurs when training on IEMOCAP with the newly uploaded version and Transformers v4.3.3.
Is there any updated code besides the README?
