New models for softmax #355
base: main
Conversation
Force-pushed from 347a9e3 to 8042b51
Hi Pierre: just wanted to check if you have an estimate for when you think this will be merged. I'd love to use it! No pressure at all. Or, if you indicate it's safe to test, I could locally clone this PR and play around with it? Thanks.
Force-pushed from df973ca to 16c95a9
Force-pushed from aa2ef69 to 0c981b3
Hi, I have no estimate currently. The models are working, I think, but it's a big change and there is a lot of cleanup left. Note that what is implemented so far uses softmax for scikit-learn only: multinomial logistic regression and neural networks. I haven't implemented counterparts for Keras and PyTorch yet (the code should be very simple, but there would be a lot of testing). I think it is usable, though, so if you want to play with it you can try it (and more testing cannot hurt). Best,
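Both scikit-learn models mentioned here reduce classification to a softmax over linear scores. As a reference for what the PR's exp / sum-of-exp variables have to reproduce (plain Python, not this PR's Gurobi formulation), a minimal numerically stable softmax sketch:

```python
import math

def softmax(scores):
    """Numerically stable softmax: shift by the max before exponentiating,
    so math.exp never overflows for large scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Class probabilities for three raw scores; they sum to 1.
probs = softmax([2.0, 1.0, 0.1])
```

The shift by the maximum leaves the result unchanged (numerator and denominator are both scaled by exp(-m)) but keeps every exponent non-positive.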
Force-pushed from 95f7a20 to 3c35471
Thanks, will do!
Force-pushed from 363b38d to ad4c84c
Force-pushed from dbab218 to 327d178
Force-pushed from a0339f4 to 70b1b78
Rebase the commit with many conflicts; hopefully it is still correct.
- Add a notebook testing multinomial logistic regression
- Do an adversarial model
- Put a different revision number
- Add a trivial OBBT to the multinomial logistic regression notebook
- Do hand scaling for MNIST with logistic regression
- Store variables modeling the exponential and the sum of exponentials
- Try reworking arguments of logistic regression
Also add a notebook for neural network with softmax activation
Some renaming: I was calling it "mixing" in neural networks and "affinetrans" in logistic functions.
It's almost working for LogisticRegression and MLPClassifier.
If predict_function is identity, we don't apply any softmax.
For the neural network classifier, use the out_activation argument to specify what we want to use in the last layer. For logistic regression, use predict_function (not sure it is such a good idea yet).
Tests and documentation are still missing.
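The dispatch described above could look roughly like this (a hypothetical sketch with invented helper and return names; the PR's actual argument handling may differ):

```python
def select_output_formulation(model_type,
                              out_activation="softmax",
                              predict_function="softmax"):
    """Pick which last-layer formulation to build.

    Hypothetical sketch of the dispatch described in the PR:
    neural network classifiers are driven by `out_activation`,
    logistic regression by `predict_function`; "identity" means
    no softmax is applied to the raw scores.
    """
    key = out_activation if model_type == "mlpclassifier" else predict_function
    if key == "identity":
        return "linear"   # raw linear scores, no softmax constraints
    if key == "softmax":
        return "softmax"  # model exp and sum-of-exp variables
    raise ValueError(f"unsupported activation: {key!r}")
```

Keeping the two keyword names separate mirrors the underlying libraries: scikit-learn's MLPClassifier exposes an output activation, while logistic regression only has a notion of how predictions are computed.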
Force-pushed from 2259241 to f75c9e6
Force-pushed from e569c36 to d269a1f
Also adds a classification test; this will probably be overkill for GH Actions, but try it.
Force-pushed from d269a1f to f1f1f34
Do new models for softmax