Support max_unpool2d lowering #3733

Draft · jinchen62 wants to merge 3 commits into main from maxunpool

Conversation

jinchen62 (Collaborator) commented Sep 25, 2024

Support torch->linalg lowering for max_unpool2d.

Fixes nod-ai/SHARK-ModelDev#718
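
For context, max_unpool2d is essentially an index scatter; a minimal reference sketch of its semantics on one flattened channel (illustrative only, not the torch-mlir implementation):

```cpp
#include <cstdint>
#include <vector>

// Reference semantics of max_unpool2d on a single flattened channel:
// each input value is scattered to the flat output position recorded in
// `indices` by the preceding max_pool2d; all other positions stay zero.
std::vector<float> maxUnpool2dRef(const std::vector<float> &input,
                                  const std::vector<int64_t> &indices,
                                  int64_t outH, int64_t outW) {
  std::vector<float> output(outH * outW, 0.0f);
  for (size_t i = 0; i < input.size(); ++i)
    output[indices[i]] = input[i];
  return output;
}
```

The kernel, stride, and padding only affect the output shape, which is why the same scatter can serve both the 2d and 3d cases discussed below.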

jinchen62 force-pushed the maxunpool branch 9 times, most recently from 09326ef to 13abfb6 on September 26, 2024
jinchen62 changed the title from "Generalize max_unpool lowering" to "Support max_unpool2d lowering" on Sep 26, 2024
vivekkhandelwal1 (Collaborator) left a comment


Hi @jinchen62, the PR says that it supports MaxUnpool2d lowering, but I see that it has been removed from some of the files. Also, to verify this PR, please add some e2e tests so that we can test the support added for the op. Please correct me if I have stated something wrong.

jinchen62 (Collaborator, Author)

@vivekkhandelwal1 It gets lowered to torch.aten.max_unpool3d, which can then be lowered to linalg. I added lit tests for both the onnx->torch and torch->linalg lowerings. Basically there is no big difference between the 2d and 3d cases, so they can be generalized, but I couldn't rename the op because of upstream PyTorch; I attached the related links in the commit message.

zjgarvey (Collaborator) commented Oct 2, 2024

@samutamm

vivekkhandelwal1 (Collaborator)

> @vivekkhandelwal1 It gets lowered to torch.aten.max_unpool3d, which can then be lowered to linalg. I added lit tests for both the onnx->torch and torch->linalg lowerings. Basically there is no big difference between the 2d and 3d cases, so they can be generalized, but I couldn't rename the op because of upstream PyTorch; I attached the related links in the commit message.

Do you mean to say that you're lowering the 4-d input case of Onnx.Unpool, which should instead lower to AtenMaxUnpool2d, to AtenMaxUnpool3d, and handling the 4-d input in the lowering of AtenMaxUnpool3d itself rather than as a separate op?

jinchen62 (Collaborator, Author)

@vivekkhandelwal1 Yes, 2D and 3D max_unpool can be generalized as one op.

vivekkhandelwal1 (Collaborator) commented Oct 4, 2024

> @vivekkhandelwal1 Yes, 2D and 3D max_unpool can be generalized as one op.

That's fine, but what you've done in this PR is not correct. You have added support for the 2d pooling case inside the MaxUnpool3d op, which is wrong. Ideally, you should have added the lowering for the MaxUnpool2d op. If there is a PyTorch-related issue with that, then you can define a new op in https://github.com/llvm/torch-mlir/blob/main/include/torch-mlir/Dialect/Torch/IR/TorchOps.td (before this, we have to be sure what the exact issue with using the TorchMaxUnpool2d op is, and whether it can be fixed in upstream PyTorch), say TorchAtenMaxUnpoolOp, then decompose the Unpool3d and 2d ops to this particular op and add the torch->linalg lowering for it.
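
A rough sketch of that suggestion (the generalized AtenMaxUnpoolOp below is hypothetical; it stands in for the op that would be added to TorchOps.td):

```cpp
// Hypothetical decomposition: rewrite the existing 2d op into a generalized
// unpool op so that 2d and 3d can share a single torch->linalg lowering.
// AtenMaxUnpoolOp does not exist in torch-mlir today; it is illustrative.
class DecomposeAtenMaxUnpool2dOp : public OpRewritePattern<AtenMaxUnpool2dOp> {
public:
  using OpRewritePattern::OpRewritePattern;
  LogicalResult matchAndRewrite(AtenMaxUnpool2dOp op,
                                PatternRewriter &rewriter) const override {
    // Forward self/indices/output_size; a real generalized op would also
    // carry explicit kernel, stride, and padding operands.
    rewriter.replaceOpWithNewOp<AtenMaxUnpoolOp>(
        op, op.getType(), op.getSelf(), op.getIndices(), op.getOutputSize());
    return success();
  }
};
```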

jinchen62 (Collaborator, Author) commented Oct 4, 2024

@vivekkhandelwal1 For sure I could add a separate lowering for 2D, but that would be mostly duplicated code. Is it okay?

vivekkhandelwal1 (Collaborator)

> @vivekkhandelwal1 For sure I could add a separate lowering for 2D, but that would be mostly duplicated code. Is it okay?

No, you should not do it in a way that duplicates the code. Instead, move all the common code into a utility or templatize the lowering so that it can be reused.
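
For instance, a minimal sketch of a templatized conversion pattern, with the shared scatter logic elided (names and structure are illustrative):

```cpp
#include <type_traits>

// One pattern, instantiated for both ops; only the number of spatial
// dimensions differs, so the lowering body is written once.
template <typename OpTy>
class ConvertAtenMaxUnpoolOp : public OpConversionPattern<OpTy> {
public:
  using OpConversionPattern<OpTy>::OpConversionPattern;
  using OpAdaptor = typename OpTy::Adaptor;

  LogicalResult
  matchAndRewrite(OpTy op, OpAdaptor adaptor,
                  ConversionPatternRewriter &rewriter) const override {
    constexpr int64_t spatialDims =
        std::is_same_v<OpTy, AtenMaxUnpool2dOp> ? 2 : 3;
    // ... shared logic: compute the output shape, create a zero-initialized
    // tensor, and scatter input values to the positions in `indices`,
    // parameterized on spatialDims ...
    (void)spatialDims;
    return success();
  }
};

// Registration then covers both ops with one implementation:
// patterns.add<ConvertAtenMaxUnpoolOp<AtenMaxUnpool2dOp>>(typeConverter, ctx);
// patterns.add<ConvertAtenMaxUnpoolOp<AtenMaxUnpool3dOp>>(typeConverter, ctx);
```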

jinchen62 (Collaborator, Author) commented Oct 8, 2024

@vivekkhandelwal1 Using the 3D lowering is also because torch.aten.max_unpool2d is missing the pads and strides inputs, as mentioned in nod-ai/SHARK-ModelDev#764 (comment). I wonder why we don't pass more info through the torch op, even the kernel_shape, so we wouldn't need to calculate the kernel size here: https://github.com/llvm/torch-mlir/blob/main/lib/Conversion/TorchToLinalg/Pooling.cpp#L664. Do you have any suggestions on how to lower the 2D case without pads and strides?
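
For reference, the inference the linked code has to perform when the torch op carries no pads/strides looks roughly like this standalone sketch (the helper name is made up; it assumes stride == kernel and zero padding):

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Without explicit pads/strides on the torch op, the kernel size can only
// be recovered by assuming stride == kernel and padding == 0, i.e.
// outSize = inSize * kernel for each spatial dimension.
std::optional<std::vector<int64_t>>
inferUnpoolKernel(const std::vector<int64_t> &inSizes,
                  const std::vector<int64_t> &outSizes) {
  std::vector<int64_t> kernel;
  for (size_t i = 0; i < inSizes.size(); ++i) {
    if (outSizes[i] % inSizes[i] != 0)
      return std::nullopt; // shape not expressible under these assumptions
    kernel.push_back(outSizes[i] / inSizes[i]);
  }
  return kernel;
}
```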

jinchen62 force-pushed the maxunpool branch 4 times, most recently from 80c2426 to 0eeff42 on October 9, 2024
vivekkhandelwal1 (Collaborator)

> @vivekkhandelwal1 Using the 3D lowering is also because torch.aten.max_unpool2d is missing the pads and strides inputs, as mentioned in nod-ai/SHARK-ModelDev#764 (comment). I wonder why we don't pass more info through the torch op, even the kernel_shape, so we wouldn't need to calculate the kernel size here: https://github.com/llvm/torch-mlir/blob/main/lib/Conversion/TorchToLinalg/Pooling.cpp#L664. Do you have any suggestions on how to lower the 2D case without pads and strides?

I think this is actually an issue with the PyTorch definition of the max_unpool2d op. The possible ways to fix this are either to make the fix in upstream PyTorch or to define an op in TorchOps.td and use that for the lowering.

jinchen62 force-pushed the maxunpool branch 2 times, most recently from 6e8caa0 to 828f1b6 on October 9, 2024
jinchen62 (Collaborator, Author)

Need to merge pytorch/pytorch#138805 first.

jinchen62 marked this pull request as a draft on November 20, 2024
Successfully merging this pull request may close these issues.

Add TorchToLinalg lowering for MaxUnpool operation