
Add xtensor support, remove Eigen #607

Merged
merged 4 commits into from
Jan 30, 2024
Conversation


@psavery psavery commented Jan 19, 2024

We have decided to use xtensor instead of Eigen for the C++ code, for a few reasons:

  1. Eigen's support for tensors (3 or more dimensions) is limited, and we need tensor support. xtensor supports them well.
  2. Eigen defaults to Fortran (column-major) ordering, whereas xtensor uses C (row-major) ordering. NumPy also uses C ordering, so converting to Eigen sometimes forces deep copies that xtensor avoids.
  3. The xtensor project also maintains xsimd, which we are already using.

Adding xtensor support is a great step towards accelerating other functions.
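Reason 2 can be illustrated from the NumPy side (a minimal sketch; the array shape is arbitrary): keeping a default NumPy array in C order requires no copy, while converting it to Fortran order forces a deep copy.

```python
import numpy as np

# NumPy arrays are C-ordered (row-major) by default.
a = np.arange(6).reshape(2, 3)
assert a.flags['C_CONTIGUOUS']

# Handing it to a C-ordered library (like xtensor): no copy is needed.
c_arr = np.ascontiguousarray(a)
assert np.shares_memory(c_arr, a)

# Handing it to a Fortran-ordered layout (Eigen's default): a deep copy is made.
f_arr = np.asfortranarray(a)
assert not np.shares_memory(f_arr, a)
```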

@psavery psavery marked this pull request as ready for review January 30, 2024 17:33
@psavery psavery requested a review from ZackAttack614 January 30, 2024 17:38

psavery commented Jan 30, 2024

@ZackAttack614 Ready for merging after you approve.

psavery and others added 4 commits January 30, 2024 16:06
We are planning to replace Eigen with xtensor. This is a first step.

It builds successfully, but the tests will not pass, because the
inverse distortion function needs to be re-written to use xtensor
instead of Eigen.

Signed-off-by: Patrick Avery <[email protected]>
This was required for the compilation with xsimd to succeed.

Signed-off-by: Patrick Avery <[email protected]>
@ZackAttack614 ZackAttack614 left a comment


This should be good for the scope of this PR. The next PR, related to the transforms module, will include linear algebra operations and will require adding xtensor-blas to the conda installation. We can exclude that for now to keep this prerelease as clean as possible for user testing.

@psavery psavery merged commit 2dd3ab2 into master Jan 30, 2024
6 checks passed
@psavery psavery deleted the xtensor-support branch January 30, 2024 23:27