if __name__ == '__main__':
    net = generate_model('S').cuda()
    # print(net)
    from torchsummary import summary
    inputs = torch.rand(8, 3, 10, 112, 112).cuda()  # (N, C, T, H, W)
    output = net(inputs)
    print(output.shape)
    summary(net, input_size=(3, 10, 112, 112), batch_size=8, device='cuda')
The code runs successfully, except for the summary call. The error report was:
File "x3d.py", line 382, in <module>
summary(net,input_size=(3,10,112,112),batch_size=8,device='cuda')
File "D:\software\program\Anaconda3\envs\pytorch1\lib\site-packages\torchsummary\torchsummary.py", line 72, in summary
model(*x)
File "D:\software\program\Anaconda3\envs\pytorch1\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
return forward_call(*input, **kwargs)
File "x3d.py", line 324, in forward
x = self.bn1(x)
File "D:\software\program\Anaconda3\envs\pytorch1\lib\site-packages\torch\nn\modules\module.py", line 1128, in _call_impl
result = forward_call(*input, **kwargs)
File "x3d.py", line 52, in forward
x = x.view(n // self.num_splits, c * self.num_splits, t, h, w)
RuntimeError: shape '[0, 192, 10, 56, 56]' is invalid for input of size 1505280
I found that in forward the shape of x was (2, 3, 10, 112, 112) rather than (8, 3, 10, 112, 112), and I don't know why. Do you know why?
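For what it's worth, torchsummary appears to build its dummy input with a hardcoded batch size of 2 (its batch_size argument only affects the printed numbers), which would explain the (2, 3, 10, 112, 112) input. With a batch of 2, the split reshape in x3d.py then targets a zero batch dimension. A minimal sketch, assuming num_splits = 8 and 24 channels at that layer (as the [0, 192, 10, 56, 56] shape suggests):

import torch

# Assumed values inferred from the error message: 24 channels * 8 splits = 192,
# and a dummy batch of 2 as torchsummary seems to use internally.
num_splits = 8
x = torch.rand(2, 24, 10, 56, 56)   # 2 * 24 * 10 * 56 * 56 = 1505280 elements

n, c, t, h, w = x.shape
# 2 // 8 == 0, so the target shape is [0, 192, 10, 56, 56] and view() raises:
# RuntimeError: shape '[0, 192, 10, 56, 56]' is invalid for input of size 1505280
x = x.view(n // num_splits, c * num_splits, t, h, w)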
def forward(self, x):
    if self.training:
        # Regroup the batch into num_splits groups so each group gets its
        # own BatchNorm statistics, then restore the original shape.
        n, c, t, h, w = x.shape
        x = x.view(n // self.num_splits, c * self.num_splits, t, h, w)
        x = self.split_bn(x)
        x = x.view(n, c, t, h, w)
    else:
        # In eval mode, use the standard BatchNorm (self.bn).
        x = self.bn(x)
    if self.affine:
        x = x * self.weight.view((-1, 1, 1, 1))
        x = x + self.bias.view((-1, 1, 1, 1))
    return x
Add this code to the file.
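Based on the forward above, one possible workaround (an untested sketch): the num_splits reshape only runs when self.training is True, so putting the model in eval mode before calling summary takes the plain self.bn branch and avoids the failing view():

# Sketch: run summary in eval mode so the else-branch (self.bn) above is used
# and no reshape by num_splits is attempted on torchsummary's small dummy batch.
net.eval()
summary(net, input_size=(3, 10, 112, 112), batch_size=8, device='cuda')
net.train()  # switch back afterwards if you continue training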