[Question] Progress during inference #6194
Comments
Thanks for using LightGBM, and for taking the time to put up this suggestion. I personally don't support taking on this feature. See #5867 for some related discussion. Some things you could try to speed up predictions in the situation you've described:
Docs on parameters: https://lightgbm.readthedocs.io/en/latest/Parameters.html
Docs on how to use
Thank you for the insights, closing this issue.
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.
When predicting with a large booster on a large dataset, inference can take several hours (at least on my setup). It would be nice if there were a way to see progress.
Ideally I would love to have a tqdm bar over the number of samples.
I am willing to accept some slowdown in exchange for the information, but splitting my samples into chunks and then calling predict on each chunk wrapped in tqdm feels clunky... I hope there is some better way.
Thank you!
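For reference, the chunked-predict workaround mentioned above can be sketched as follows. This is not a LightGBM feature, just a generic helper; the `predict_fn`, `chunk_size`, and `progress` names are illustrative, and any per-batch callable (e.g. `booster.predict`) can be passed in. A toy function stands in for the booster here.

```python
import numpy as np

def predict_in_chunks(predict_fn, X, chunk_size=10_000, progress=None):
    """Run predict_fn over X in row chunks, optionally reporting progress.

    predict_fn: any per-batch prediction callable, e.g. booster.predict
    progress:   optional callback receiving (rows_done, rows_total)
    """
    out = []
    n = X.shape[0]
    for start in range(0, n, chunk_size):
        # Predict on one slice of rows at a time.
        out.append(predict_fn(X[start:start + chunk_size]))
        if progress is not None:
            progress(min(start + chunk_size, n), n)
    return np.concatenate(out)

# Toy stand-in for booster.predict: doubles the first column.
X = np.arange(25_000, dtype=float).reshape(-1, 1)
preds = predict_in_chunks(
    lambda chunk: chunk[:, 0] * 2.0,
    X,
    chunk_size=10_000,
    progress=lambda done, total: print(f"{done}/{total} rows"),
)
```

With tqdm installed, the same loop can instead iterate over `tqdm(range(0, n, chunk_size))` and drop the callback; the per-chunk overhead is usually negligible next to the prediction itself for reasonably large chunks.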