
Incorporate minibatch-training #11

Open — wants to merge 1 commit into main
Conversation

@hsmaan (Member) commented Jan 22, 2024

Currently, minibatch training is not incorporated because of limitations in the graph-based training setup.

The aims of this PR are to:

  • Enable minibatch training in the model
  • Test the stability of minibatch training, especially with the model's current hyperparameter setup
  • Once stability is ensured, incorporate explicit parallelization for multi-GPU training (DataParallel is already available in BaseTrainer, but DDP is not implemented)
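The first step above amounts to iterating over shuffled subsets of sample indices each epoch instead of training on the full graph at once. A minimal, stdlib-only sketch of such a batching loop (the names `num_samples` and `batch_size` are illustrative, not taken from the repository; the actual implementation would need graph-aware sampling, e.g. neighbor sampling, to form valid subgraphs per batch):

```python
import random

def minibatch_indices(num_samples, batch_size, seed=0):
    """Yield shuffled index minibatches that cover every sample once per epoch."""
    rng = random.Random(seed)
    indices = list(range(num_samples))
    rng.shuffle(indices)
    for start in range(0, num_samples, batch_size):
        yield indices[start:start + batch_size]

# Example: 10 samples in batches of 4 -> three batches (sizes 4, 4, 2)
batches = list(minibatch_indices(10, 4))
```

In a PyTorch trainer, this role is typically played by `torch.utils.data.DataLoader` (or a graph-specific loader); the stability concern in the second bullet arises because hyperparameters tuned for full-graph updates may not transfer to the noisier minibatch gradient estimates.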

@hsmaan hsmaan added the enhancement New feature or request label Jan 22, 2024
@hsmaan hsmaan requested a review from subercui January 22, 2024 21:14
@hsmaan hsmaan mentioned this pull request Jan 22, 2024
Labels
enhancement New feature or request
2 participants