In saic_depth_completion\metrics\absolute.py, class LogDepthL1loss, line 35:

diff = torch.abs(torch.log(gt[mask]) - pred[mask])

Is there a mistake here? Why compute |log(gt) - pred|? I think the code should be

diff = torch.abs(torch.log(gt[mask]) - torch.log(pred[mask]))

Is this correct?
Hi @eecoder-dyf,
The code is correct as-is, since the model is already tasked with predicting the logarithmic depth (see dm_lrn.py#L28).
Therefore, we only have to apply the logarithm to the ground truth depth to directly compute the L1 loss.
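In other words, because pred is already in log-depth space, taking torch.log(gt) makes both operands comparable. A minimal sketch of that idea in isolation (the function name and signature here are hypothetical, not the repository's actual code):

```python
import torch

def log_depth_l1(pred_log, gt, mask):
    # pred_log: network output, assumed to already be in log-depth space
    # gt: ground-truth depth (linear space), so only gt needs the log applied
    # mask: boolean tensor selecting valid depth pixels
    diff = torch.abs(torch.log(gt[mask]) - pred_log[mask])
    return diff.mean()
```

If the network instead produced linear depth, the second snippet in the question (logging both tensors) would be the correct form; the two are equivalent once pred_log = log(pred_depth).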