feature[next]: Improve CollapseTuple pass #1350
Conversation
Just some initial comments; I didn't look at the details of the actual pass yet, since you wanted to add more comments.
Move `gt4py.next.iterator.ir_makers` and `gt4py.next.iterator.transforms.common_pattern_matcher` into a new module named `gt4py.next.iterator.ir_utils`. Just a small refactoring in preparation for #1350.
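For readers following along, a minimal sketch of how call sites change under this refactoring, assuming only the module move described above (the `im` alias matches the convention used in the snippets below):

```python
# Before the refactoring:
#   from gt4py.next.iterator import ir_makers as im
#   from gt4py.next.iterator.transforms import common_pattern_matcher

# After the refactoring, both utilities live under `ir_utils`:
from gt4py.next.iterator.ir_utils import ir_makers as im
from gt4py.next.iterator.ir_utils import common_pattern_matcher
```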
Small change to conform to upcoming error message guidelines.
A better solution would be to extend the ITIR pass to unpack tuple args (awaiting GridTools#1350).
cscs-ci run
```python
def is_provable_equal(a: itir.Expr, b: itir.Expr):
    """
    Return true if, bot not only if, two expression (with equal scope) have the same value.
    """
```
I don't understand the "bot not only if", even after bot->but
It is a sufficient, but not a necessary condition. I think the formulation is correct, as `Return true if and only if` certainly is, but I was already unhappy with it when I wrote it. I'll take any suggestion. I also renamed the function to `is_equal`; I vaguely remember we had a discussion about it in person.
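To illustrate the "sufficient, but not necessary" point in code, here is a hedged sketch with toy stand-in node classes (not the actual ITIR nodes or the real `is_equal` implementation): a purely structural comparison can only return `True` for expressions it recognizes as identical, so semantically equal but syntactically different expressions may still compare as `False`.

```python
import dataclasses

# Toy stand-ins for expression nodes (assumptions for illustration only).
@dataclasses.dataclass(frozen=True)
class SymRef:
    id: str

@dataclasses.dataclass(frozen=True)
class FunCall:
    fun: str
    args: tuple

def is_equal(a, b) -> bool:
    """True implies the expressions (in the same scope) have the same value;
    equal values do not imply True."""
    return a == b  # structural comparison only

# Structurally identical: provably equal.
assert is_equal(FunCall("plus", (SymRef("a"), SymRef("b"))),
                FunCall("plus", (SymRef("a"), SymRef("b"))))
# Same value (addition commutes), but not structurally identical: conservatively False.
assert not is_equal(FunCall("plus", (SymRef("a"), SymRef("b"))),
                    FunCall("plus", (SymRef("b"), SymRef("a")))) 
```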
```python
if transformation == self.Flag.REMOVE_LETIFIED_MAKE_TUPLE_ELEMENTS:
    continue
```
Can you get rid of this? If it's not part of the default set, does it have to be accessible in the same pass?
This was part of the "I don't really like the structure here". The `REMOVE_LETIFIED_MAKE_TUPLE_ELEMENTS` transformation is a post-processing step. I made it an argument like `ignore_tuple_size`. This is not super convenient for the tests, but it's okay with me.
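A hedged sketch of the resulting structure (the flag names and the overall `apply` signature are assumptions, not the actual `CollapseTuple` API): the post-processing step is controlled by a keyword argument of `apply`, alongside `ignore_tuple_size`, instead of being a member of the `Flag` set.

```python
import enum

class CollapseTupleSketch:
    """Illustrative stand-in for the pass structure discussed above."""

    class Flag(enum.Flag):
        # Hypothetical members of the default transformation set.
        COLLAPSE_TUPLE_GET_MAKE_TUPLE = enum.auto()
        PROPAGATE_TO_IF_ON_TUPLES = enum.auto()

    def __init__(self, flags: "CollapseTupleSketch.Flag"):
        self.flags = flags

    def _transform(self, node):
        # Placeholder for visiting the tree with the transformations in `self.flags`
        # (`ignore_tuple_size` is omitted from this placeholder body).
        return node

    @staticmethod
    def _remove_letified_make_tuple_elements(node):
        # Placeholder for the clean-up of letified `make_tuple` elements.
        return node

    @classmethod
    def apply(
        cls,
        node,
        *,
        ignore_tuple_size: bool = False,
        remove_letified_make_tuple_elements: bool = True,  # argument, not a Flag member
        flags: "CollapseTupleSketch.Flag | None" = None,
    ):
        flags = flags if flags is not None else ~cls.Flag(0)  # default: all transformations
        result = cls(flags)._transform(node)
        if remove_letified_make_tuple_elements:
            # Post-processing runs after the main transformations, so tests can
            # disable it to inspect the intermediate form.
            result = cls._remove_letified_make_tuple_elements(result)
        return result
```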
```python
# TODO: fuse with test_propagate_to_if_on_tuples_with_let
testee = im.let("val", im.if_("cond", im.make_tuple(1, 2), im.make_tuple(3, 4)))(
    im.tuple_get(0, "val")
)
```
Do you have enough coverage to test the `fp_transform`?
No, and this is rather hard to test, as it is in the end just an optimization to
- keep the number of iterations in the `pass_manager` low,
- keep the number of visits of the tree in this pass low.

In other words, regardless of how often `fp_transform` runs (once is enough), the result of the `pass_manager` is always the same. I'm open to suggestions.
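For illustration, a hedged, generic sketch of the fixed-point behaviour described above (not the actual `fp_transform` or `pass_manager` code): once a transformation reaches its fixed point, running it again changes nothing, so extra runs only cost tree visits, never the final result.

```python
from typing import Callable, TypeVar

T = TypeVar("T")

def fp_transform_sketch(transform: Callable[[T], T], node: T, max_iter: int = 100) -> T:
    """Apply `transform` until the result no longer changes (a fixed point)."""
    for _ in range(max_iter):
        new_node = transform(node)
        if new_node == node:
            return node
        node = new_node
    raise RuntimeError("no fixed point reached within max_iter iterations")

# Toy transformation: strip redundant `id(...)` wrappers one layer at a time.
def strip_id(expr: str) -> str:
    return expr[3:-1] if expr.startswith("id(") and expr.endswith(")") else expr

once = fp_transform_sketch(strip_id, "id(id(x))")
again = fp_transform_sketch(strip_id, once)  # a second run is a no-op
assert once == again == "x"
```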
Significantly improves the collapse tuple pass, preparing wider support for `if` and `let` statements.

EDIT: Initially, the tests for `if` statements were re-enabled for the GTFN backend in this PR. This is now postponed to #1449.
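To make the intent of the pass concrete, a hedged example built with the `ir_makers` used in the test snippet above (the "after" forms are a reading of what the improved pass aims to produce, not output copied from this PR):

```python
from gt4py.next.iterator.ir_utils import ir_makers as im

# Basic collapse: projecting an element out of a tuple literal.
before = im.tuple_get(1, im.make_tuple("a", "b"))
# conceptually after CollapseTuple:  b

# Combined with propagation over `let`/`if` on tuples, the test case quoted
# above no longer needs to materialize any tuple:
before = im.let("val", im.if_("cond", im.make_tuple(1, 2), im.make_tuple(3, 4)))(
    im.tuple_get(0, "val")
)
# conceptually after CollapseTuple:  if cond then 1 else 3
```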