Allow multiple experiments with the same configuration #3
Hi! May I ask: are these really different experiments, or the same experiment (with exactly the same model and everything) used with different data or something like that?
It's the same experiment with the same model, the same hyperparameters, and the same data. My primary reason is that it simplifies development, since you run the same code over and over again. The second reason is that sometimes you want to average the results of several experiments with the same settings but different random seeds.
I see. For these purposes, Experiment has an "implicit_resuming" flag which, when set to True, allows running the same experiment without explicitly passing the "resume_from" argument. I typically use it when I need to run the same model on different folds.
It might work for development, but it would be nice to have a dedicated experiment with a unique id for each model. Anyway, I'll try "implicit_resuming" and see how it works...
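For readers following along, here is a library-agnostic toy sketch of the behaviour discussed above, assuming the gist of "implicit_resuming" is "reuse the existing experiment directory instead of failing on a duplicate configuration". The class and argument layout below are made up for illustration and are not the library's actual API.

```python
import os


class ToyExperiment:
    """Toy illustration only, not the library's real Experiment class.

    If a directory for this experiment name already exists, either raise
    an error (default) or silently reuse it when implicit_resuming=True.
    """

    def __init__(self, name: str, base_dir: str = "experiments",
                 implicit_resuming: bool = False):
        self.path = os.path.join(base_dir, name)
        if os.path.isdir(self.path) and not implicit_resuming:
            raise RuntimeError(
                f"Experiment '{name}' already exists; "
                f"pass implicit_resuming=True to reuse it"
            )
        os.makedirs(self.path, exist_ok=True)


# The first run creates the directory; the second succeeds only because of the flag.
ToyExperiment("same_config")
ToyExperiment("same_config", implicit_resuming=True)
```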
Hi,
Thank you for such a great library, just what I was looking for!
However, quite often I need to run multiple experiments with the same configuration, which is currently not possible.
I think it would make sense to assign a unique identifier to an experiment rather than to a configuration.
I've drafted this functionality here: jgc128@6e493b4
Every time a user runs an experiment, it generates a random unique folder for it, and an optional experiment_prefix parameter makes it easy to distinguish between related but different experiments. Is this something you would be interested in? I can open a pull request if you'd like.
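For illustration, a minimal self-contained sketch of the proposed behaviour; this is not the code from the linked commit, and the function name make_run_dir and the timestamp-plus-random-suffix naming scheme are assumptions made for this example.

```python
import os
import uuid
from datetime import datetime


def make_run_dir(base_dir: str, experiment_prefix: str = "") -> str:
    """Create a unique directory for one experiment run.

    Illustrative sketch only: each run gets its own folder, and an optional
    prefix groups related runs under a recognisable name.
    """
    run_id = f"{datetime.now():%Y%m%d_%H%M%S}_{uuid.uuid4().hex[:8]}"
    name = f"{experiment_prefix}_{run_id}" if experiment_prefix else run_id
    path = os.path.join(base_dir, name)
    os.makedirs(path)  # fail loudly in the unlikely event of a name collision
    return path


# Two runs with exactly the same configuration land in different folders:
print(make_run_dir("experiments", experiment_prefix="lstm_baseline"))
print(make_run_dir("experiments", experiment_prefix="lstm_baseline"))
```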