
The LogDepthL1loss code #9

Open
eecoder-dyf opened this issue Nov 20, 2022 · 1 comment

Comments


eecoder-dyf commented Nov 20, 2022

In saic_depth_completion/metrics/absolute.py, class LogDepthL1Loss, line 35:

diff = torch.abs(torch.log(gt[mask]) - pred[mask])

Is there a mistake here? Why use |log(gt) − pred|?
I think the code should be

diff = torch.abs(torch.log(gt[mask])-torch.log(pred[mask]))

Is this correct?


DiTo97 commented Jun 26, 2023

Hi @eecoder-dyf,

The code is correct as-is, since the model is already tasked with predicting the logarithmic depth (see dm_lrn.py#L28).

Therefore, we only have to apply the logarithm to the ground truth depth to directly compute the L1 loss.
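To make this concrete, here is a minimal NumPy sketch of the two variants (the function name `log_depth_l1` and the toy tensors are illustrative, not from the repository). Assuming the network output `pred_log` is already in log-depth space, applying the logarithm only to the ground truth is exactly the L1 loss in log space:

```python
import numpy as np

def log_depth_l1(pred_log, gt, mask):
    """L1 loss in log-depth space.

    pred_log -- network output, assumed to already be log-depth
    gt       -- ground-truth depth in linear (metric) units
    mask     -- boolean array marking valid depth pixels
    """
    # Log-transform only the ground truth; pred_log is already logarithmic.
    diff = np.abs(np.log(gt[mask]) - pred_log[mask])
    return diff.mean()

# Sanity check: a perfect prediction in log space yields zero loss.
pred_log = np.array([[0.0, 1.0], [2.0, 3.0]])
gt = np.exp(pred_log)              # linear depth consistent with pred_log
mask = np.ones_like(gt, dtype=bool)
loss = log_depth_l1(pred_log, gt, mask)
```

Taking `log(pred)` again, as the original question suggested, would only be correct if the network predicted linear depth; since it predicts log-depth, that would apply the logarithm twice.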
