
A better method to enforce monotonic constraints in regression and classification trees #7073

Open
JoshuaC3 opened this issue Jul 2, 2021 · 1 comment

JoshuaC3 commented Jul 2, 2021

Here is a paper titled "A better method to enforce monotonic constraints in regression and classification trees":

https://arxiv.org/abs/2011.00986

It suggests that the standard monotonic constraints for tree-based models constrain the model too tightly and harshly impact predictive accuracy, and it proposes two new methods that improve on the current one. The figure below shows how much better they can perform on the Adult dataset, with monotonic constraints on age, education and hours_per_week. You can see how these features could legally be required to be monotonic in certain regulated spaces:

[Figure: performance comparison of the constraint methods on the Adult dataset]
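
For reference, here is a minimal sketch of how the current, standard monotonic constraints are specified today via XGBoost's scikit-learn wrapper (the synthetic data only stands in for the three Adult columns; the column order and hyperparameters are placeholders, not taken from the paper):

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)

# Stand-in for the three Adult columns: age, education_num, hours_per_week.
X = rng.uniform(size=(1000, 3))
y = (X @ np.array([1.0, 0.5, 0.25]) + rng.normal(scale=0.1, size=1000) > 0.9).astype(int)

# "(1,1,1)" forces the prediction to be non-decreasing in every feature;
# -1 would force non-increasing, 0 leaves a feature unconstrained.
model = xgb.XGBClassifier(
    n_estimators=200,
    max_depth=4,
    tree_method="hist",
    monotone_constraints="(1,1,1)",
)
model.fit(X, y)
```

As far as I understand, this is the standard enforcement scheme the paper argues is overly restrictive.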

Here are the pull requests where these changes were added to LightGBM:
microsoft/LightGBM#3264
microsoft/LightGBM#2939
microsoft/LightGBM#2770

The parameters for the model are documented here, which should give a good, quick, high-level explanation of how and why the new functionality works:
https://github.com/microsoft/LightGBM/blob/aab8fc18a259ca9ca95db5bac203b81cb2e1ad49/docs/Parameters.rst#monotone_constraints_method
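
To make that concrete, here is a rough sketch (my reading of LightGBM >= 3.0's scikit-learn wrapper; the data and hyperparameters are placeholders) of how the monotone_constraints_method parameter selects the enforcement strategy:

```python
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)

# Stand-in data again for age, education_num, hours_per_week.
X = rng.uniform(size=(1000, 3))
y = (X @ np.array([1.0, 0.5, 0.25]) + rng.normal(scale=0.1, size=1000) > 0.9).astype(int)

clf = lgb.LGBMClassifier(
    n_estimators=200,
    monotone_constraints=[1, 1, 1],          # non-decreasing in each feature
    monotone_constraints_method="advanced",  # "basic" (classic) | "intermediate" | "advanced"
    monotone_penalty=0.0,                    # optional: forbid monotone splits near the root
)
clf.fit(X, y)
```

As far as I can tell, the "intermediate" and "advanced" options are the methods introduced in the pull requests above, while "basic" keeps the old behaviour.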

Here is a nice theoretical example:
[Figure: theoretical example illustrating the constraint methods]

trivialfis (Member) commented

Thanks for raising the issue with such detail! Let me do some reading.
