Token dropout/limits #250

Answered by O-J1
yggdrasil75 asked this question in Q&A

I have some captions with over 1000 tokens so that I can be extremely specific in my prompts and get highly detailed results (the same reason I am training at 1536-2048). I was wondering about this, though: I know that Stable Diffusion is typically trained on ~75 tokens, and that this is not a hard model limit but a training issue; prompts with longer descriptions don't give higher-quality results because the model was never trained on them.
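For context, the ~75 figure comes from CLIP's 77-token context window (75 content tokens plus the start/end specials). A quick check with the transformers tokenizer (my own snippet, not anything from this tool) makes that concrete:

```python
# Quick sanity check, assuming the transformers library and the standard
# SD 1.x CLIP tokenizer: count how many tokens a caption actually uses.
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

caption = "a highly detailed photo of ..."  # stand-in for a long caption
ids = tokenizer(caption).input_ids  # includes BOS and EOS tokens

print(f"{len(ids) - 2} content tokens (+2 special)")
# CLIP's text encoder has a 77-token context: 75 content tokens plus the
# start/end tokens, which is where the usual "~75 token" limit comes from.
```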

There is the option for "keep tag count", which simply leaves the first n tags unshuffled when shuffling is enabled (going by its description), but does the tool do anything else to the caption automatically? Does it drop the end of the prompt? Could it be made so that the prompt length is randomized, i.e. sometimes 75 tokens, sometimes 150… (see the sketch below for the kind of thing I mean).
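To illustrate, here is a rough sketch of such an augmentation step. This is a purely hypothetical helper, not the tool's actual code, and it uses word counts as a crude stand-in for real token counts:

```python
# Hypothetical caption augmentation: shuffle the tags while pinning the
# first `keep_tags` in place, then truncate to a randomly chosen budget
# each time the caption is sampled.
import random

def augment_caption(caption: str, keep_tags: int = 1,
                    budgets: tuple = (75, 150, 225)) -> str:
    tags = [t.strip() for t in caption.split(",")]
    head, tail = tags[:keep_tags], tags[keep_tags:]
    random.shuffle(tail)  # "keep tag count": the first n tags stay put
    tags = head + tail

    # Randomize the effective prompt length: pick a budget and drop whole
    # tags from the end once it is exceeded. (A real implementation would
    # need the CLIP tokenizer; words are only a rough approximation.)
    budget = random.choice(budgets)
    kept, used = [], 0
    for tag in tags:
        used += len(tag.split())
        if used > budget and kept:
            break
        kept.append(tag)
    return ", ".join(kept)
```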

Replies: 1 comment

Answer selected by O-J1