
Changing input size #1

Open
sulaimanvesal opened this issue Jul 4, 2023 · 0 comments

Comments

sulaimanvesal commented Jul 4, 2023

Hi,

I was able to reproduce the results on the BraTS2018 data. However, when I train the model on another dataset with a different input size (20x256x256 or 32x256x256), I get an error in this part:

# Decoder
d_base_feat_size = [4, 5, 6]
# PatchExpand Output
torch.Size([1, 544, 1280]) # x.shape
8 10 12 1 544 1280 # print(D, H, W, B, L, C)
---------------------------------------------
x = x.view(B, D, H, W, C)
RuntimeError: shape '[1, 8, 10, 12, 1280]' is invalid for input of size 696320
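For reference, the numbers in the traceback already show the mismatch: `view(B, D, H, W, C)` requires the token count `L` to equal `D*H*W`, but here `L = 544` while the target grid has `8*10*12 = 960` positions. A minimal arithmetic check, using only the values printed above:

```python
# Values copied from the traceback above.
B, L, C = 1, 544, 1280   # shape of x after PatchExpand: (B, L, C)
D, H, W = 8, 10, 12      # target spatial grid for the view

# view(B, D, H, W, C) only works when L == D*H*W.
expected_L = D * H * W
print(expected_L)        # 960 -- but x actually has L = 544
print(B * L * C)         # 696320 -- the "input of size" in the RuntimeError
print(L == expected_L)   # False, hence the shape error
```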

How do you compute d_base_feat_size for an input size of (20x256x256 or 32x256x256)?
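For context, one plausible reading (my assumption, not confirmed by the repo) is that `d_base_feat_size` is the input volume divided by the network's total downsampling factor: a factor of 32 would reproduce `[4, 5, 6]` from a BraTS crop of 128x160x192. The helper below is a hypothetical sketch under that assumption; the function name and the factor are mine:

```python
def d_base_feat_size(input_size, total_downsample=32):
    """Hypothetical helper: bottleneck grid = input size / total downsampling.

    Assumes the encoder downsamples each spatial axis by `total_downsample`
    overall, so every axis must be divisible by that factor.
    """
    for s in input_size:
        if s % total_downsample:
            raise ValueError(
                f"dimension {s} is not divisible by {total_downsample}; "
                "pad or resample the volume first"
            )
    return [s // total_downsample for s in input_size]

print(d_base_feat_size((128, 160, 192)))  # [4, 5, 6], matching the repo value
print(d_base_feat_size((32, 256, 256)))   # [1, 8, 8]
# d_base_feat_size((20, 256, 256)) raises: 20 is not divisible by 32,
# which would explain a shape mismatch like the one above.
```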
