
[Question] Is Microsoft still supporting this project? #6128

Closed
julioasotodv opened this issue Oct 6, 2023 · 8 comments

@julioasotodv
Contributor

Hi all!

I know for a fact that @jameslamb is doing an amazing job maintaining this project.

With that said, I am starting to get a bit concerned about how much Microsoft is supporting this project in terms of (wo)man hours and overall effort. Given that this project lives under the microsoft GitHub organization, I would assume that Microsoft as a company allocates resources to improve and maintain LightGBM. But let me tell you, as an outsider it definitely does not look like that is the case.

With XGBoost 2.0, a great number of the features that used to be LGBM's "special sauce" are now also available in that library (for me personally, the only feature still missing in XGBoost is the linear_tree feature).

This may look like a rant, but I would really love to have a Microsoft representative's take on this topic, and perhaps a bit of transparency from their side about the plans to keep LGBM at the same quality standards, and with the same interesting innovations, that it used to have.

Thank you all!

@jameslamb
Collaborator

Thanks for your interest in LightGBM.

@shiyu1994 is our representative from Microsoft, and I'll let him speak to how Microsoft is supporting the project and the level of support it plans to offer in the future.

@mayer79
Contributor

mayer79 commented Oct 8, 2023

Not speaking for Microsoft, but: to me, the main advantage of LightGBM remains its incredible speed. XGBoost has certainly caught up lately, but LGB still appears to be way faster. In the future, I'd love to work a bit on LGB's R interface.

@julioasotodv
Contributor Author

@mayer79 is it? Maybe on CPU it is still slightly faster than XGB if compiled aggressively, but the GPU implementation in XGB (or CatBoost, for that matter) is objectively better.

@shiyu1994
Collaborator

@julioasotodv Thanks for your interest in LightGBM.
The short answer to your question: yes, we are definitely still supporting LightGBM. Here are some of the updates from our team within Microsoft over the past year:

These may not seem like a lot of new features, but we are making the algorithm lighter and faster with non-trivial technical innovations, and we are working to make these features more stable and usable out of the box. For example, we plan to include CUDA builds in the released packages so that no manual installation step is needed to run LightGBM with GPUs, and our multi-GPU support is on its way into this main repo.

For the latest benchmarks of training efficiency on CPUs and GPUs, you may refer to the tables in our paper: https://arxiv.org/abs/2207.09682. With our new GPU version, together with quantized training, we see an overall training speedup of up to 3x compared with the faster of XGBoost and CatBoost.
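(For readers who want to try this combination, here is a minimal sketch using the Python API. It assumes a LightGBM 4.x build compiled with CUDA support; `device_type`, `use_quantized_grad`, and `num_grad_quant_bins` are the documented parameter names for the CUDA tree learner and gradient quantization, and the dataset here is synthetic.)

```python
# Minimal sketch: quantized training on the CUDA device.
# Assumes a LightGBM >= 4.0 build compiled with CUDA support
# and a synthetic regression dataset.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 50))
y = 2.0 * X[:, 0] + rng.normal(size=10_000)

train_set = lgb.Dataset(X, label=y)

params = {
    "objective": "regression",
    "device_type": "cuda",       # CUDA tree learner (needs a CUDA-enabled build)
    "use_quantized_grad": True,  # train with low-precision (quantized) gradients
    "num_grad_quant_bins": 4,    # number of bins used to quantize gradients
    "verbose": -1,
}

booster = lgb.train(params, train_set, num_boost_round=100)
print(booster.current_iteration())
```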

LightGBM is a valuable open-source project, and we will keep putting effort into it to maintain its excellence. If you have any suggestions, please feel free to post them here or contact us. Thanks again!

@adfea9c0

adfea9c0 commented Oct 10, 2023

@mayer79 is it? Maybe on CPU it is still slightly faster than XGB if compiled aggressively, but the GPU implementation in XGB (or CatBoost, for that matter) is objectively better.

@julioasotodv When did you test this? When I compared XGBoost and LightGBM on GPU recently, I found LightGBM noticeably faster.

@onacrame

I’ve traditionally used LightGBM models in a corporate setting, but I have to say some of the features in other GBDT libraries do appeal to me. I think the following features would be great to have in future versions of LightGBM (and are already implemented in other GBDT libraries):

-Multi-output regression: different targets, but also multiple outputs from the same tree structure (for example, a multiclass classification model), as per SketchBoost as well as XGBoost, which has this feature in beta.

-Uncertainty estimation (absolutely critical in the finance and medical fields). Yes, there is quantile regression, but quantile regression is not particularly efficient (a workaround sketch follows at the end of this comment). See for example what CatBoost has done here, but even better would be an approach based on conformal prediction, which provides validity guarantees.

-Out-of-core training (see the recent XGBoost release).

-Faster inference. See what packages like treelite and lleaves have done to increase inference speed by orders of magnitude.

Overall I love the package, but I would like to see more resources put behind it. LLMs are all the rage these days, but the overwhelming majority of the world’s data is still tabular, and strong support is needed for state-of-the-art tabular models.
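(On the uncertainty-estimation point above: today's workaround in LightGBM is the built-in quantile objective, which requires training one model per quantile and is part of the inefficiency mentioned. A rough sketch with the scikit-learn interface, using the standard `objective="quantile"` / `alpha` parameters on a synthetic dataset:)

```python
# Rough sketch: 10%/90% prediction intervals via LightGBM's quantile objective.
# One booster is trained per quantile; a conformal approach would instead
# wrap a single point model and calibrate the interval width.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(5_000, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=5_000)

models = {
    alpha: lgb.LGBMRegressor(
        objective="quantile",  # pinball loss
        alpha=alpha,           # target quantile
        n_estimators=200,
    ).fit(X, y)
    for alpha in (0.1, 0.9)
}

X_new = np.linspace(0, 10, 5).reshape(-1, 1)
lower = models[0.1].predict(X_new)
upper = models[0.9].predict(X_new)
print(np.column_stack([lower, upper]))  # approximate 80% interval per row
```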

@jameslamb
Collaborator

It's been a month with no new posts here, so I'm going to close this.

For discussion of which feature requests you'd like to see prioritized, please comment on #2302.

If you want to request a new feature that you don't yet see in #2302, add a new feature request at https://github.com/microsoft/LightGBM/issues/new/choose.

would like to see more resources put behind it

This is an open source project and we welcome contributions from anyone. Please do consider contributing code and documentation to the project. If you're unsure where to start, browse the open issues at https://github.com/microsoft/LightGBM/issues.


This issue has been automatically locked because there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues, including a reference to this one.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 20, 2024