Some Questions About Clone Detection experiments #5

Open
fly1ngpengu1ns opened this issue Aug 19, 2024 · 0 comments

@fly1ngpengu1ns

Hi,
I have recently been studying this work and am very interested in it. However, I noticed that in the code clone detection experiments, the paper states that except for GraphCodeBERT's experiment on POJ-104, you took the results directly from CodeXGLUE.

However, as we know from microsoft/CodeXGLUE#63, the CodeXGLUE authors found a problem in the code they originally used to calculate the results and corrected it in their latest paper. But the results quoted in ContraBERT are the ones from before the correction.

So I would like to know:

  1. When you ran the GraphCodeBERT experiment, did you use the corrected calculation method to obtain the results?
  2. Did you preprocess the comments in the dataset, and is the metric you report the eval MAP or the test MAP? The reported MAP is 90.46, but I obtained the following results (the MAP definition I refer to is sketched after the table):
| Eval MAP | Test MAP | Comments handling |
|----------|----------|-------------------|
| 0.8451   | 0.8795   | with comments     |
| 0.8463   | 0.8926   | without comments  |
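
For clarity, here is a minimal sketch of the retrieval-MAP definition I am referring to; this is my own sketch rather than the exact CodeXGLUE evaluator. It assumes an (N, N) similarity matrix between program embeddings and treats two POJ-104 programs as clones iff they share a problem id, excluding each query from its own ranking:

```python
import numpy as np

def mean_average_precision(similarity: np.ndarray, labels: np.ndarray) -> float:
    """Retrieval MAP for POJ-104-style clone detection (sketch, not the
    official evaluator).

    similarity: (N, N) pairwise similarities between program embeddings.
    labels:     (N,) problem ids; two programs are clones iff ids match.
    """
    n = similarity.shape[0]
    ap_scores = []
    for i in range(n):
        order = np.argsort(-similarity[i])     # rank candidates, best first
        order = order[order != i]              # exclude the query itself
        relevant = labels[order] == labels[i]  # clone = same problem id
        if not relevant.any():
            continue                           # no true clones for this query
        hits = np.cumsum(relevant)
        precision_at_k = hits / np.arange(1, len(order) + 1)
        # average precision = mean of precision@k over the ranks of true clones
        ap_scores.append(precision_at_k[relevant].mean())
    return float(np.mean(ap_scores))

# Usage with L2-normalised embeddings (cosine similarity):
#   sim = embeddings @ embeddings.T
#   print(mean_average_precision(sim, problem_ids))
```

If the benchmark metric is actually MAP@R (with R the number of true clones per query), truncating `order` to its first R entries before computing the precisions would give that variant instead.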

I am very much looking forward to your answers!
