Better tests for l2 alpha scaling between different sklearn estimators #17
Conversation
Force-pushed from 4e96964 to f7fe2d1
coef,
intercept,
loss=loss,
regul=estimator.alpha * batch_size / X.shape[0],
lol wtf? Do we inherit this from sklearn's black magic?
yes :(
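For context: the "black magic" is that sklearn estimators disagree on whether `alpha` is applied against a summed or a per-sample-averaged loss, which is presumably what the `batch_size / X.shape[0]` rescaling above compensates for. A minimal sketch (not part of this PR) of the kind of equivalence involved, using Ridge (l2 penalty against the summed squared error) versus ElasticNet with `l1_ratio=0` (l2 penalty against the mean squared error):

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Ridge

rng = np.random.RandomState(0)
n_samples, n_features = 200, 5
X = rng.randn(n_samples, n_features)
y = X @ rng.randn(n_features) + 0.1 * rng.randn(n_samples)

alpha_ridge = 1.0
ridge = Ridge(alpha=alpha_ridge, fit_intercept=False).fit(X, y)

# ElasticNet divides the squared error by n_samples, so matching Ridge's l2
# strength requires rescaling alpha by 1 / n_samples.
enet = ElasticNet(
    alpha=alpha_ridge / n_samples,
    l1_ratio=0.0,  # pure l2 penalty
    fit_intercept=False,
    tol=1e-10,
).fit(X, y)

np.testing.assert_allclose(ridge.coef_, enet.coef_, rtol=1e-4)
```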
@@ -25,23 +25,25 @@
from sklearn.metrics import log_loss, mean_squared_error
from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import LabelBinarizer, StandardScaler
from statsmodels.genmod import families
Should it be added somewhere in the config files as a required dependency for testing?
It's declared in pyproject.toml: when you run `hatch test`, Hatch downloads the additional test-time dependencies. That's a Hatch-centric way of doing it; for something more generic we could introduce a dependency group, see https://docs.astral.sh/uv/concepts/projects/dependencies/#development-dependencies
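For reference, a sketch of both options (the table name assumes Hatch's default `hatch-test` environment, and the group name below is just an example):

```toml
# Hatch-centric: pulled in only when running `hatch test`
[tool.hatch.envs.hatch-test]
extra-dependencies = [
  "statsmodels",
]

# Generic alternative: a PEP 735 dependency group, understood by uv
[dependency-groups]
test = [
  "statsmodels",
]
```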
Force-pushed from a8c6b16 to a9f4021
Hope the new tests in test_linear.py will be enough.
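A hypothetical sketch (not the actual contents of test_linear.py) of the scaling relation such tests exercise: LogisticRegression weights the summed log-loss by `C`, while a mean-loss formulation uses a per-sample `alpha`, so the two solutions coincide when `C = 1 / (n_samples * alpha)`:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
n_samples, n_features = 300, 4
X = rng.randn(n_samples, n_features)
y = (X @ rng.randn(n_features) + 0.3 * rng.randn(n_samples) > 0).astype(int)
y_pm = 2 * y - 1  # labels in {-1, +1} for the margin formulation

alpha = 0.1  # per-sample l2 strength in the mean-loss parameterization

def objective(w):
    # mean log-loss + 0.5 * alpha * ||w||^2
    margins = y_pm * (X @ w)
    return np.logaddexp(0.0, -margins).mean() + 0.5 * alpha * np.dot(w, w)

def gradient(w):
    margins = y_pm * (X @ w)
    return -(X * (y_pm * expit(-margins))[:, None]).mean(axis=0) + alpha * w

w_ref = minimize(
    objective, np.zeros(n_features), jac=gradient,
    method="L-BFGS-B", options={"gtol": 1e-10},
).x

clf = LogisticRegression(
    C=1.0 / (n_samples * alpha),  # the scaling under test
    fit_intercept=False,
    tol=1e-10,
    max_iter=10_000,
).fit(X, y)

np.testing.assert_allclose(clf.coef_.ravel(), w_ref, rtol=1e-4)
```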