In this line : https://github.com/HRNet/HRNet-Semantic-Segmentation/blob/HRNet-OCR/lib/models/seg_hrnet_ocr.py#L64
Question:
According to this line,
probs = F.softmax(self.scale * probs, dim=2)  # batch x k x hw
the input has shape [batch_size, num_class, fh*fw].
The softmax is applied over dim 2, which means the values sum to one over the feature-map dimension (fh*fw).
However, in my opinion, the softmax dimension should be 1, so that the values sum to one over the num_class dimension.
The corrected code would be:
probs = F.softmax(self.scale * probs, dim=1)  # batch x num_class x hw
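To make the difference concrete, here is a minimal NumPy sketch (shapes are hypothetical, and the softmax is reimplemented rather than taken from the repo) showing which axis sums to one under each choice: with dim=2 each class's spatial map is normalized to a distribution over pixels, while with dim=1 each pixel gets a distribution over classes.

```python
import numpy as np

def softmax(x, axis):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
batch, num_class, hw = 2, 19, 64  # hypothetical sizes for illustration
logits = rng.standard_normal((batch, num_class, hw))

# dim=2 (the code as written): each class's map sums to 1 over the hw positions
p2 = softmax(logits, axis=2)
print(np.allclose(p2.sum(axis=2), 1.0))  # True

# dim=1 (the proposed change): each pixel's class scores sum to 1
p1 = softmax(logits, axis=1)
print(np.allclose(p1.sum(axis=1), 1.0))  # True
```

Note that if the probs tensor is later multiplied against features of shape [batch, hw, channels] (e.g. via a batch matmul), the dim=2 normalization turns that product into a per-class weighted average over spatial positions, which may be the intended behavior here rather than a bug.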