I am using cross-validation to estimate the performance of my model. Right now I call it like this:

```java
ClassificationMetrics vm = new Validator<>(ClassificationMetrics.class, configuration)
        .validate(new KFoldSplitter(10).split(trainingDataframe),
                  new MultinomialNaiveBayes.TrainingParameters());
```

In `com.datumbox.framework.core.machinelearning.common.abstracts.algorithms.AbstractNaiveBayes` I see there is a `setLogPriors` method that could probably be used to tune the model. (I want to create a DET graph of the model's performance by varying the prior probabilities.) Is there a way to set the prior probability of the different labels for cross-validation? Thanks.
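For context, a DET graph plots the miss rate against the false-alarm rate as the decision threshold (or, equivalently, the prior) is swept. Independently of Datumbox, one operating point can be computed from posterior scores like this (a minimal sketch; the method name `missAndFalseAlarm` and the toy data are mine, not part of the framework):

```java
import java.util.*;

// Sketch: one operating point of a DET curve (miss rate vs. false-alarm rate)
// computed from per-example positive-class scores at a given decision threshold.
public class DetPoint {

    // Returns {missRate, falseAlarmRate} for the given threshold.
    static double[] missAndFalseAlarm(double[] scores, boolean[] isPositive, double threshold) {
        int misses = 0, falseAlarms = 0, positives = 0, negatives = 0;
        for (int i = 0; i < scores.length; i++) {
            boolean predictedPositive = scores[i] >= threshold;
            if (isPositive[i]) {
                positives++;
                if (!predictedPositive) misses++;      // positive example rejected
            } else {
                negatives++;
                if (predictedPositive) falseAlarms++;  // negative example accepted
            }
        }
        return new double[] { (double) misses / positives, (double) falseAlarms / negatives };
    }

    public static void main(String[] args) {
        double[] scores = {0.9, 0.8, 0.7, 0.3, 0.6, 0.2, 0.1, 0.4};
        boolean[] labels = {true, true, true, true, false, false, false, false};
        double[] pt = missAndFalseAlarm(scores, labels, 0.5);
        System.out.println("miss=" + pt[0] + " falseAlarm=" + pt[1]); // 0.25 and 0.25 here
    }
}
```

Sweeping the threshold over the sorted scores and collecting these points yields the full DET curve.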
Not at the moment. Everything inside the ModelParameters classes is estimated during the training phase. You can indirectly affect the prior probabilities by resampling the dataset, but you can't set them directly. This is a limitation that can be addressed in an upcoming version.
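Since Naive Bayes estimates priors from label frequencies, the resampling workaround amounts to duplicating (or dropping) records until each class has the desired frequency. A minimal sketch of oversampling to a target prior, using plain labels rather than Datumbox `Record` objects (the class and method names here are illustrative, not framework API):

```java
import java.util.*;
import java.util.stream.*;

// Sketch: oversample one class until its empirical frequency reaches a desired
// prior. Training Naive Bayes on the resampled data then uses that prior,
// since the model estimates priors from label counts.
public class PriorResampler {

    static List<String> resampleToPrior(List<String> labels, String target,
                                        double desiredPrior, long seed) {
        List<String> result = new ArrayList<>(labels);
        List<String> pool = labels.stream()
                .filter(l -> l.equals(target))
                .collect(Collectors.toList());
        Random rng = new Random(seed);
        // Duplicate random records of the target class until its frequency
        // reaches the desired prior probability.
        while ((double) Collections.frequency(result, target) / result.size() < desiredPrior) {
            result.add(pool.get(rng.nextInt(pool.size())));
        }
        return result;
    }

    public static void main(String[] args) {
        // Start with 20% "pos" and 80% "neg".
        List<String> labels = new ArrayList<>();
        for (int i = 0; i < 20; i++) labels.add("pos");
        for (int i = 0; i < 80; i++) labels.add("neg");

        List<String> resampled = resampleToPrior(labels, "pos", 0.4, 42L);
        double prior = (double) Collections.frequency(resampled, "pos") / resampled.size();
        System.out.println("pos prior after resampling: " + prior);
    }
}
```

Repeating the cross-validation run over several resampled copies of the Dataframe, each with a different target prior, would give the points needed for the DET graph.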