For a neural network, if you change the number of features, you need to change the input dimension and therefore the number of neurons.
So, we could have an option like:
`leaving='drop'` for the current behaviour
`leaving='replace'` for NNs
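To make the idea concrete, here is a minimal sketch of what such a `leaving` option could look like. The function name `leave_feature` and the strategy names are assumptions for illustration, not part of the library's current API:

```python
import pandas as pd

def leave_feature(df: pd.DataFrame, feature: str, leaving: str = "drop") -> pd.DataFrame:
    """Hypothetical 'leaving' strategies (illustrative only):
    - 'drop': remove the column (current behaviour; changes the input dimension)
    - 'replace': overwrite with a constant, so the NN input shape is preserved
    """
    if leaving == "drop":
        return df.drop(columns=[feature])
    elif leaving == "replace":
        out = df.copy()
        out[feature] = 0.0  # constant fill keeps the number of columns fixed
        return out
    raise ValueError(f"unknown leaving strategy: {leaving!r}")

df = pd.DataFrame({"a": [1.0, 2.0, 3.0], "b": [4.0, 5.0, 6.0]})
print(leave_feature(df, "a", "drop").shape)     # (3, 1): column removed
print(leave_feature(df, "a", "replace").shape)  # (3, 2): shape unchanged
```

With `'replace'`, the model can be evaluated on the same architecture because the number of input columns never changes.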
What do you think?
I have only one concern. Most batch-normalization implementations perform safe normalization, but someone may have a custom model that assumes all features have non-zero std. So replacing the feature with small random noise could also be an option.
Yes, there could be different "leaving strategies"; I think a string field would give more flexibility. Good point about batch normalization and the risk of dividing by zero.
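A noise-based strategy could look like the sketch below. The function name and the `scale` parameter are assumptions for illustration; the point is only that the replaced column keeps a non-zero std, so a model that normalizes its inputs won't divide by zero:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def replace_with_noise(df: pd.DataFrame, feature: str, scale: float = 1e-3) -> pd.DataFrame:
    # Hypothetical 'noise' leaving strategy: overwrite the feature with
    # small random values. The column carries no signal but its std stays
    # non-zero, unlike a constant fill.
    out = df.copy()
    out[feature] = rng.normal(0.0, scale, size=len(out))
    return out

df = pd.DataFrame({"a": [1.0, 2.0, 3.0], "b": [4.0, 5.0, 6.0]})
noisy = replace_with_noise(df, "a")
print(noisy.shape == df.shape)    # True: input dimension preserved
print(noisy["a"].std() > 0)       # True: safe for custom normalization
```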
Side remark: with a NN you will most probably use FLOFO instead of LOFO anyway; at least that fits my current use case, because I have too many features and training takes too long.