forked from microsoft/Megatron-DeepSpeed
merge: microsoft-main into main #62
Merged
Conversation
Zero Bubble for Megatron-DeepSpeed
* Remove schedule variant
* Integrate zbpp into Megatron
* Clean up redundant lines
* Rename variables
* Edit README
* Undo README edit
* Minor change
* Add an example to the examples directory

Co-authored-by: ufotalent <[email protected]>
Co-authored-by: Wan Xinyi <[email protected]>
Co-authored-by: Costin Eseanu <[email protected]>
* Use split/squeeze instead of slice for performance; GPU may not see a perf difference, but HPU perf improves with this
* Add copyrights
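A minimal sketch of the split/squeeze idea from the commit above, not the actual Megatron-DeepSpeed change; the tensor name and shapes are illustrative only:

```python
import torch

# Illustrative stacked QKV-style tensor [seq, batch, 3, head_dim]
# (layout is an assumption, not the real Megatron-DeepSpeed layout).
mixed = torch.randn(1024, 4, 3, 128)

# Slice-based extraction: one indexing op per projection.
q_slice = mixed[:, :, 0, :]
k_slice = mixed[:, :, 1, :]
v_slice = mixed[:, :, 2, :]

# split/squeeze-based extraction: a single split along dim=2, then drop the
# singleton dimension. Functionally equivalent, reportedly faster on HPU.
q, k, v = [t.squeeze(2) for t in torch.split(mixed, 1, dim=2)]

assert torch.equal(q, q_slice) and torch.equal(k, k_slice) and torch.equal(v, v_slice)
```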
Improve performance by keeping attention_mask on device and running ops further on device (microsoft#411)
* Add copyrights
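A hedged sketch of the attention_mask-on-device idea from the commit above; the helper name and the simple causal-mask shape are assumptions, not the actual implementation:

```python
import torch

def build_causal_mask(seq_len, device):
    # Hypothetical helper: allocate and build the causal mask directly on the
    # accelerator instead of constructing it on CPU and copying it over. Any
    # ops applied to the mask afterwards then also stay on device.
    mask = torch.tril(torch.ones(seq_len, seq_len, device=device, dtype=torch.bool))
    return mask.view(1, 1, seq_len, seq_len)

# Example: the mask never touches host memory.
mask = build_causal_mask(2048, device="cuda" if torch.cuda.is_available() else "cpu")
```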
* Improve RoPE perf by using cached sin/cos tensors
* Add copyrights
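A hedged sketch of the cached sin/cos idea behind the RoPE commit above: precompute the rotary tables once and reuse them across forward passes instead of recomputing them every step. Class and method signatures are illustrative, not the actual Megatron-DeepSpeed code:

```python
import torch

class CachedRotaryEmbedding(torch.nn.Module):
    """Illustrative rotary embedding that caches sin/cos tables per sequence length."""

    def __init__(self, dim, base=10000):
        super().__init__()
        inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
        self.register_buffer("inv_freq", inv_freq)
        self._cached_seq_len = 0
        self._cos_cached = None
        self._sin_cached = None

    def forward(self, seq_len, device, dtype):
        # Recompute the tables only when the requested sequence length grows;
        # otherwise reuse the cached tensors (the point of the perf change).
        if seq_len > self._cached_seq_len:
            t = torch.arange(seq_len, device=device, dtype=self.inv_freq.dtype)
            freqs = torch.outer(t, self.inv_freq.to(device))
            emb = torch.cat((freqs, freqs), dim=-1)
            self._cos_cached = emb.cos().to(dtype)
            self._sin_cached = emb.sin().to(dtype)
            self._cached_seq_len = seq_len
        return self._cos_cached[:seq_len], self._sin_cached[:seq_len]
```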
* Extend test utilities to support more accelerators
* Add Intel copyright
* Update arguments.py
* Update training.py
* Create profiler.py
* Add copyrights
* Update profiler.py
* Add copyrights
* Update help
* Add copyrights
* Refine wandb logging function
* Address comments
* Enable user to specify wandb local save dir
* Update and fix comments
* Update
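A hedged sketch of what a user-specified wandb local save directory can look like; the `--wandb-save-dir` flag name is hypothetical, though `wandb.init`'s `dir` parameter does control where run files are written locally:

```python
import argparse
import wandb

parser = argparse.ArgumentParser()
# Hypothetical flag; the actual Megatron-DeepSpeed argument name may differ.
parser.add_argument("--wandb-save-dir", type=str, default=None,
                    help="Local directory where wandb stores its run files.")
args = parser.parse_args()

# wandb.init()'s `dir` argument controls the local save location for the run.
wandb.init(project="megatron-lm", dir=args.wandb_save_dir)
```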
… (microsoft#412)
* Update arguments.py
* Update training.py
* Update utils.py
* Add copyrights
* Add copyrights
* Add copyrights
* Update arguments.py help
* Update arguments.py
* Update training.py
* Update utils.py
* Update arguments.py
…processing (microsoft#421)
* Update arguments.py
* Update tokenizer.py
* Update preprocess_data.py
* Update module.py
* Update preprocess_data.py
* Add copyrights
* Add copyrights
* Update tokenizer.py
* Add copyrights
This PR adds a Llama universal checkpointing example to examples_deepspeed/universal_checkpointing. It also includes changes to the README, some minor changes, and an update to the TensorBoard analysis script.
…into hzheng-data-fix
…Megatron-DeepSpeed into hzheng-data-fix
Pull in changes from 6acc370 to [`megatron/utils.py`](https://github.com/argonne-lcf/Megatron-DeepSpeed)
[merge]: into `microsoft-main` $\leftarrow$ from `hzheng-data-fix`
Should be good to go. I forgot I put a blocker requiring a review before merging.
hatanp approved these changes on Nov 12, 2024
112 Commits, 72 files changed, +7339 -2537
LGTM :-)
Runs well on Sunspot with a fresh install and train_aGPT_7B.sh.
This PR resolves the merge conflicts introduced in #35, containing upstream changes from microsoft/Megatron-DeepSpeed.