
v22.6.0 #1907

Merged
merged 48 commits into master on Jan 27, 2024

Conversation

bmaltais (Owner)
Huge amounts of updates... see README.md for details

akx and others added 30 commits January 16, 2024 14:32
Fix typo `--spda` (it's `--sdpa`; see the SDPA note after this commit list)
Fix VRAM usage in LoRA training
…rain_network (or sdxl_train_network) (#1057)

* Add fp8 support (a storage sketch follows this commit list)

* remove some debug prints

* Better implementation for TE (text encoder)

* Fix some misunderstanding

* Same as the U-Net: add explicit conversion

* Better implementation for converting the TE to fp8

* fp8 for more than just the U-Net

* Better TE caching and TE learning rate handling

* match arg name

* Fix with list

* Add timeout settings

* Fix arg style

* Add custom separator

* Fix typo

* Fix typo again

* Fix dtype error

* Fix gradient problem

* Fix req grad

* fix merge

* Fix merge

* Resolve merge

* Code arrangement and documentation

* Resolve merge error

* Add assert for mixed precision
Update Chinese documentation
Deduplicate ipex initialization code (see the helper sketch below)
Avoid grad sync on each step even when doing accumulation (see the accumulation sketch below)
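
For context on the `--sdpa` typo fix above: the flag name refers to PyTorch's scaled dot-product attention (SDPA). A minimal illustration of the underlying call, not sd-scripts' own code; the shapes here are arbitrary:

```python
import torch
import torch.nn.functional as F

# Scaled dot-product attention, the "sdpa" the corrected flag refers to.
# Shapes: (batch, heads, sequence, head_dim); values here are made up.
q = k = v = torch.randn(1, 8, 64, 40)
out = F.scaled_dot_product_attention(q, k, v)  # PyTorch picks an efficient kernel
```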
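
The fp8 commits above (from #1057) revolve around storing frozen base-model weights in float8 to save VRAM, then converting them back to the compute dtype for each matmul, hence "add explicit conversion". A minimal sketch of that idea, assuming PyTorch >= 2.1 for the float8 dtypes; the `Fp8Linear` wrapper below is illustrative, not sd-scripts' actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Fp8Linear(nn.Module):
    """Hypothetical wrapper: keep a frozen weight in float8, compute in bf16."""

    def __init__(self, linear: nn.Linear, compute_dtype: torch.dtype = torch.bfloat16):
        super().__init__()
        self.compute_dtype = compute_dtype
        # Store the frozen weight in float8, roughly halving its memory vs fp16.
        self.register_buffer("weight_fp8", linear.weight.detach().to(torch.float8_e4m3fn))
        self.bias = linear.bias

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Explicit conversion back to the compute dtype before the matmul:
        # fp8 tensors cannot feed F.linear directly.
        w = self.weight_fp8.to(self.compute_dtype)
        return F.linear(x.to(self.compute_dtype), w, self.bias)
```

In the PR this is exposed through a command-line option of train_network (or sdxl_train_network), so the trainable LoRA weights stay in higher precision while the frozen base shrinks.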
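
The ipex commit consolidates repeated Intel Extension for PyTorch setup into one place. A hypothetical shape of such a shared helper; the function name and guard logic are assumptions, not the repo's actual code:

```python
def init_ipex() -> None:
    """Import Intel Extension for PyTorch if it is installed.

    Previously a block like this was repeated across training scripts;
    factoring it into one module lets every script just call init_ipex().
    """
    try:
        import intel_extension_for_pytorch as ipex  # noqa: F401
    except ImportError:
        pass  # IPEX not installed; fall back to the default backends
```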
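
The last commit names a standard DistributedDataParallel optimization: during gradient accumulation, the inter-GPU gradient all-reduce only needs to happen on the micro-batch that precedes the optimizer step. A generic sketch, assuming `model` is DDP-wrapped and returns a loss; this is not the repo's actual training loop:

```python
import contextlib

def accumulation_step(model, micro_batches, optimizer):
    optimizer.zero_grad(set_to_none=True)
    n = len(micro_batches)
    for i, batch in enumerate(micro_batches):
        # DDP normally all-reduces gradients on every backward(); no_sync()
        # suppresses that, so the costly sync happens once per optimizer step.
        ctx = contextlib.nullcontext() if i == n - 1 else model.no_sync()
        with ctx:
            loss = model(batch) / n  # scale so accumulated grads average correctly
            loss.backward()
    optimizer.step()
```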
bmaltais merged commit 62fbae6 into master on Jan 27, 2024
0 of 2 checks passed