
Improve configurability of training hyperparameters #42

Open
aribrill opened this issue Jul 19, 2018 · 0 comments

@aribrill (Collaborator)

At present, most of the training hyperparameters are required config options. Make these optional, providing reasonable defaults.
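A minimal sketch of how this could look, assuming the training options arrive as a plain Python dictionary (the names `DEFAULT_TRAINING_CONFIG` and `resolve_training_config` are hypothetical, not existing code):

```python
# Hypothetical sketch: merge user-supplied options over built-in defaults
# so previously required keys become optional.
DEFAULT_TRAINING_CONFIG = {
    'batch_size': 64,
    'num_epochs': 10,
    'learning_rate': 1e-3,
}

def resolve_training_config(user_config):
    resolved = dict(DEFAULT_TRAINING_CONFIG)
    resolved.update(user_config)  # user values override the defaults
    return resolved
```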

Add the ability to choose any optimizer available in TensorFlow (or at least all the commonly used ones), along with the ability to configure its parameters. Since the optimizers all take different arguments, this could be accomplished with an optimizer_arguments dictionary that is passed directly to the optimizer on initialization, without any intermediate parsing. This would replace the current adam_epsilon configuration parameter.
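For illustration, a sketch of the pass-through idea using the TF 1.x `tf.train` optimizers (the `build_optimizer` helper, the `config` dictionary, and the specific option names are assumptions, not existing code):

```python
import tensorflow as tf

# Illustrative mapping from config names to TensorFlow optimizer classes;
# any other tf.train optimizer could be added the same way.
OPTIMIZERS = {
    'adam': tf.train.AdamOptimizer,
    'rmsprop': tf.train.RMSPropOptimizer,
    'sgd': tf.train.GradientDescentOptimizer,
}

def build_optimizer(config):
    optimizer_cls = OPTIMIZERS[config.get('optimizer', 'adam')]
    # Pass the user-supplied arguments straight through; each optimizer
    # validates its own keyword arguments, so no intermediate parsing is needed.
    optimizer_arguments = config.get('optimizer_arguments', {})
    return optimizer_cls(learning_rate=config.get('learning_rate', 1e-3),
                         **optimizer_arguments)
```

With something like this, the current adam_epsilon setting would just become an entry such as `{'epsilon': 1e-8}` under optimizer_arguments.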

Add options for additional training hyperparameters, such as regularization (type and strength) and learning rate annealing.
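As one possible shape for the annealing option, TF 1.x already ships learning rate schedules such as tf.train.exponential_decay that could be wired to config keys (all key names below are placeholders):

```python
import tensorflow as tf

def build_learning_rate(config, global_step):
    base_lr = config.get('learning_rate', 1e-3)
    # Optional exponential annealing; with no schedule configured,
    # fall back to a constant learning rate.
    if config.get('lr_schedule') == 'exponential_decay':
        return tf.train.exponential_decay(
            base_lr,
            global_step,
            decay_steps=config.get('lr_decay_steps', 10000),
            decay_rate=config.get('lr_decay_rate', 0.96),
            staircase=config.get('lr_staircase', False))
    return base_lr
```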
