How to setLogPriors for Naive Bayes model during cross validation? #28

Open
jltchiu opened this issue Feb 20, 2018 · 1 comment
jltchiu commented Feb 20, 2018

I am using cross-validation to estimate the performance of my model. Right now I am calling it like this: `ClassificationMetrics vm = new Validator<>(ClassificationMetrics.class, configuration).validate(new KFoldSplitter(10).split(trainingDataframe), new MultinomialNaiveBayes.TrainingParameters());`
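Formatted as a self-contained sketch (the import paths are assumptions inferred from the class names quoted in this issue and may differ between Datumbox versions):

```java
// Sketch only: import paths are guessed from the class names used in this issue
// and may differ between Datumbox versions.
import com.datumbox.framework.common.Configuration;
import com.datumbox.framework.core.common.dataobjects.Dataframe;
import com.datumbox.framework.core.machinelearning.classification.MultinomialNaiveBayes;
import com.datumbox.framework.core.machinelearning.modelselection.Validator;
import com.datumbox.framework.core.machinelearning.modelselection.metrics.ClassificationMetrics;
import com.datumbox.framework.core.machinelearning.modelselection.splitters.KFoldSplitter;

public final class CrossValidationExample {

    // Runs 10-fold cross-validation of a Multinomial Naive Bayes model,
    // exactly as in the call quoted above, and returns the aggregated metrics.
    public static ClassificationMetrics crossValidate(Dataframe trainingDataframe,
                                                      Configuration configuration) {
        return new Validator<>(ClassificationMetrics.class, configuration)
                .validate(new KFoldSplitter(10).split(trainingDataframe),
                          new MultinomialNaiveBayes.TrainingParameters());
    }
}
```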

In com.datumbox.framework.core.machinelearning.common.abstracts.algorithms.AbstractNaiveBayes I see there is a setLogPriors method, which could probably be used to tune the model (I want to create a DET graph of the model's performance by varying the prior probabilities). Is there a way to set the prior probabilities of the different labels during cross-validation? Thanks.

@datumbox (Owner) commented
Not at the moment. Everything inside the ModelParameters classes is estimated during training. You can indirectly affect the prior probabilities by resampling the dataset, but you can't set them directly. This is a limitation that can be addressed in upcoming versions.
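As a rough illustration of that resampling workaround, the sketch below duplicates the records of one label before splitting, which raises that label's relative frequency and therefore the prior the model estimates for it. The Dataframe/Record accessors and the Record copy used here are assumptions about the Datumbox API, not a confirmed interface:

```java
// Sketch only: assumes Dataframe can be iterated as a collection of Record objects
// and that Record exposes getX()/getY() and an (x, y) constructor; verify these
// against the Dataframe API of the Datumbox version you are using.
import java.util.ArrayList;
import java.util.List;

import com.datumbox.framework.common.dataobjects.Record;
import com.datumbox.framework.core.common.dataobjects.Dataframe;

public final class PriorResampling {

    // Adds `extraCopies` duplicates of every record carrying the given label.
    // More copies of a label => higher relative frequency => higher estimated prior.
    public static void oversampleLabel(Dataframe data, Object label, int extraCopies) {
        List<Record> matching = new ArrayList<>();
        for (Record r : data) {
            if (label.equals(r.getY())) {
                matching.add(r);
            }
        }
        for (int i = 0; i < extraCopies; i++) {
            for (Record r : matching) {
                data.add(new Record(r.getX(), r.getY())); // assumed copy via the (x, y) constructor
            }
        }
    }
}
```

Retraining with different `extraCopies` values would then yield different effective priors, which could be used to trace out the DET graph mentioned above, at the cost of one training run per point.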

datumbox added this to the Future milestone Feb 22, 2018