
Fix the issue found in Flatten #168

Merged 13 commits from kmp5/debug/fix_flatten into master on Jan 24, 2024
Conversation

kmp5VT (Contributor) commented on Dec 22, 2023

An issue was found when calling update_w_KRP without MKL; see #167. To fix the problem, I no longer call the flatten function and instead push the transpose work onto the contract function. I also rewrote the Flatten code to work properly: it now takes an order-N tensor, flattens it to an order-3 tensor with dimensions (modes_before_n, n, modes_after_n), and then "permutes" the tensor into the flattened order during the copy stage.

Old flatten used recursion and was hard to debug. Fix by mapping the order-N tensor to an order-3 tensor (before_n, n, after_n), then doing the transpose in the copy.
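
For readers unfamiliar with the unfolding trick described above, here is a minimal, self-contained sketch of the order-3 flattening idea. It deliberately uses a plain row-major buffer rather than the BTAS Tensor API, the function name `flatten_mode_n` is purely illustrative, and the column ordering of the unfolding is one common convention that may differ from the ordering used by the actual Flatten implementation.

```cpp
#include <cstddef>
#include <vector>

// Flatten a row-major order-N tensor (extents in `dims`) along mode `n`.
// Returns the mode-n unfolding as a dims[n] x (before_n * after_n) matrix,
// stored row-major in a flat vector.
std::vector<double> flatten_mode_n(const std::vector<double>& data,
                                   const std::vector<std::size_t>& dims,
                                   std::size_t n) {
  std::size_t before = 1, after = 1;
  for (std::size_t i = 0; i < n; ++i) before *= dims[i];
  for (std::size_t i = n + 1; i < dims.size(); ++i) after *= dims[i];
  const std::size_t dn = dims[n];

  // View the order-N tensor as an order-3 object with extents
  // (before, dn, after); element (b, j, a) sits at linear offset
  // (b * dn + j) * after + a in the original row-major buffer.
  std::vector<double> flat(dn * before * after);
  for (std::size_t b = 0; b < before; ++b)
    for (std::size_t j = 0; j < dn; ++j)
      for (std::size_t a = 0; a < after; ++a)
        // "Permute" during the copy: mode n becomes the row index of the
        // unfolded matrix, the remaining modes form the column index.
        flat[j * (before * after) + b * after + a] =
            data[(b * dn + j) * after + a];
  return flat;
}
```

Viewing the tensor as (before_n, n, after_n) replaces the old recursive index bookkeeping with three simple loops, which is what makes the copy-stage "permute" straightforward to reason about.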
kmp5VT requested a review from evaleev on December 22, 2023 21:08
evaleev (Member) commented on Jan 22, 2024

@kshitij-05 have a look at the remaining failures; they all seem to be ZCP (no more CP failures). Perhaps we are indeed sitting too close to the edge?

kmp5VT (Contributor, Author) commented on Jan 22, 2024

@evaleev @kshitij-05 I believe I was able to fix the issue. It looks like it was a near-rank problem; I just turned up the rank slightly for these problems (I increased the rank by 2).

evaleev merged commit bef8f22 into master on Jan 24, 2024
evaleev deleted the kmp5/debug/fix_flatten branch on January 24, 2024 05:30