Predicting using MLJ interface is slow #123
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##             main     #123     +/-   ##
=========================================
- Coverage   96.65%   96.48%   -0.17%
=========================================
  Files          22       22
  Lines         658      655       -3
=========================================
- Hits          636      632       -4
- Misses         22       23       +1
```
Never mind this message, I was wrong.
Yes, I've done that.
BTW, I was sure that Laplace accepted vectors as input. Has anything changed, or was I wrong the whole time? The documentation says AbstractArray, but I distinctly remember this requirement...
Hmm, not sure where you saw this: https://juliatrustworthyai.github.io/LaplaceRedux.jl/stable/reference/#LaplaceRedux.predict-Tuple{LaplaceRedux.AbstractLaplace,%20AbstractArray}
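For what it's worth, the documented signature `predict(la::AbstractLaplace, X::AbstractArray)` already covers plain vectors, since `Vector` is a subtype of `AbstractArray` in Julia. A quick sanity check (plain Julia subtype queries only; no actual LaplaceRedux call is made here):

```julia
# Vectors and matrices are both AbstractArrays, so a method that
# dispatches on X::AbstractArray (as the documented LaplaceRedux.predict
# signature does) accepts either. No vector-specific method is needed.
println(Vector{Float64} <: AbstractArray)  # true
println(Matrix{Float64} <: AbstractArray)  # true
```

So the vector requirement you remember would have been satisfied by the `AbstractArray` signature all along.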
I know, I know, it must have been something I misunderstood in the first days, oh well.
@pasq-cat I think we should indeed just go with the direct MLJ interface #121, but for now I would still merge this, since it addresses the compute times.