100%. Good luck with the app without FP8. Even with 24 GB of VRAM, Flux is asking for trouble if you try to use several ControlNets with the full-size model, and Schnell's quality is poor and it doesn't work well.
Is there an existing issue for this?
Contact Details
No response
What should this feature add?
Hi,
I would like to request support for this FP8 conversion of the Flux Union ControlNet. It is about 3 GB, so it should allow faster inference, particularly with the Blur and OpenPose modes, which aren't officially available from BFL.

On a PC with 16 GB of VRAM and 64 GB of RAM, the 6 GB model spills into shared VRAM, which slows generation down drastically.

This is the FP8-converted version of the original Flux Union model:
https://huggingface.co/boricuapab/FLUX.1-dev-Controlnet-Union-fp8/tree/main
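For a rough sense of why the FP8 version matters on a 16 GB card, here is a back-of-the-envelope sketch in Python. The parameter count and base-model footprint below are assumptions for illustration, not measurements from this setup:

```python
GIB = 1024 ** 3  # bytes per GiB

def model_size_gib(n_params: int, bytes_per_param: int) -> float:
    """Approximate checkpoint/VRAM footprint for n_params weights."""
    return n_params * bytes_per_param / GIB

def spills_to_shared_vram(total_gib: float, vram_gib: float) -> bool:
    """True if the working set exceeds dedicated VRAM, at which point
    the driver starts paging into much slower shared system RAM."""
    return total_gib > vram_gib

# Assumed figures for illustration only: ~3.3B ControlNet parameters,
# and ~11.9 GiB already used by the base model plus activations.
N_PARAMS = 3_300_000_000
BASE_GIB = 11.9
VRAM_GIB = 16.0

cn_fp16 = model_size_gib(N_PARAMS, 2)  # fp16: 2 bytes/param, ~6.1 GiB
cn_fp8 = model_size_gib(N_PARAMS, 1)   # fp8:  1 byte/param,  ~3.1 GiB

print(spills_to_shared_vram(BASE_GIB + cn_fp16, VRAM_GIB))  # True: fp16 ControlNet spills
print(spills_to_shared_vram(BASE_GIB + cn_fp8, VRAM_GIB))   # False: fp8 ControlNet fits
```

Halving the bytes per weight halves the ControlNet's footprint, which under these assumed numbers is the difference between staying inside dedicated VRAM and paging into shared memory.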
These are the speeds I get using the 6 GB model.

I tried the FP8-converted model, but I am running into this error:
Alternatives
No response
Additional Context
No response