At present, most of the training hyperparameters are required config options. Make these optional, providing reasonable defaults.
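A minimal sketch of how optional hyperparameters with defaults might look, assuming a plain dict-based config. The names `DEFAULT_HYPERPARAMS` and `resolve_hyperparams`, and the specific default values, are illustrative only and not part of the existing code:

```python
# Hypothetical defaults; the keys and values here are illustrative only.
DEFAULT_HYPERPARAMS = {
    "batch_size": 32,
    "learning_rate": 1e-3,
    "num_epochs": 10,
}

def resolve_hyperparams(user_config):
    """Merge user-supplied config over the defaults, so every key becomes optional."""
    params = dict(DEFAULT_HYPERPARAMS)
    params.update(user_config or {})
    return params
```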
Add the ability to choose any optimizer available in TensorFlow (or at least all the commonly used ones), as well as the ability to configure their parameters. Since there are a variety of optimizers and they all take different arguments, this could be accomplished by just having an optimizer_arguments dictionary that is passed directly to the optimizer on initialization without any intermediate parsing. This would replace the current adam_epsilon configuration parameter.
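A rough sketch of the proposed `optimizer_arguments` approach, assuming TF 2.x Keras optimizers and hypothetical config keys `optimizer` and `optimizer_arguments`:

```python
import tensorflow as tf

def build_optimizer(config):
    """Construct any TF optimizer by name, forwarding optimizer_arguments untouched.

    For example, {"optimizer": "Adam", "optimizer_arguments": {"epsilon": 1e-8}}
    would cover what the current adam_epsilon option does today.
    """
    optimizer_cls = getattr(tf.keras.optimizers, config.get("optimizer", "Adam"))
    return optimizer_cls(**config.get("optimizer_arguments", {}))
```

Because the dictionary is splatted directly into the optimizer constructor, no per-optimizer argument parsing is needed; each optimizer validates its own keyword arguments.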
Add options for additional training hyperparameters, such as regularization type and strength, and learning rate annealing (see the sketch below).
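One possible wiring of these options, again assuming TF 2.x and hypothetical config keys (`regularization_type`, `regularization_strength`, `lr_decay_rate`, `lr_decay_steps`); none of these names exist in the current config:

```python
import tensorflow as tf

def build_regularizer_and_lr(config):
    """Illustrative sketch: turn config options into a kernel regularizer
    and a (possibly annealed) learning rate."""
    reg = None
    strength = config.get("regularization_strength", 1e-4)
    if config.get("regularization_type") == "l2":
        reg = tf.keras.regularizers.l2(strength)
    elif config.get("regularization_type") == "l1":
        reg = tf.keras.regularizers.l1(strength)

    lr = config.get("learning_rate", 1e-3)
    if "lr_decay_rate" in config:
        # Exponential annealing of the learning rate over training steps.
        lr = tf.keras.optimizers.schedules.ExponentialDecay(
            initial_learning_rate=lr,
            decay_steps=config.get("lr_decay_steps", 1000),
            decay_rate=config["lr_decay_rate"],
        )
    return reg, lr
```

The returned regularizer can be attached to layers via their `kernel_regularizer` argument, and the learning rate (or schedule) can be passed to the optimizer built from `optimizer_arguments` above.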