How to use sd_embed to fix Token indices sequence length is longer than the specified maximum sequence length for this model (95 > 77) #10794
@asomoza Working code (without sd_embed) together with its log showing the token error.
Non-working code (with sd_embed), which hits the same token error.
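For reference, the way sd_embed is usually wired into an SDXL pipeline looks roughly like the sketch below. This assumes the `get_weighted_text_embeddings_sdxl` helper shown in the library's README; please verify the exact import path and return order against the installed version, since this is not my attached code.

```python
import torch
from diffusers import StableDiffusionXLPipeline
# Assumed import path from sd_embed's README; double-check against your install.
from sd_embed.embedding_funcs import get_weighted_text_embeddings_sdxl

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a very long, detailed prompt that goes well past the 77-token CLIP limit"
neg_prompt = "blurry, low quality"

# The helper is expected to chunk the prompt, encode each 77-token window,
# and concatenate the embeddings so nothing is truncated.
(
    prompt_embeds,
    prompt_neg_embeds,
    pooled_prompt_embeds,
    negative_pooled_prompt_embeds,
) = get_weighted_text_embeddings_sdxl(pipe, prompt=prompt, neg_prompt=neg_prompt)

image = pipe(
    prompt_embeds=prompt_embeds,
    negative_prompt_embeds=prompt_neg_embeds,
    pooled_prompt_embeds=pooled_prompt_embeds,
    negative_pooled_prompt_embeds=negative_pooled_prompt_embeds,
    num_inference_steps=30,
).images[0]
```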
Sadly I can't help you here, for a number of reasons. The most important one is that I'm not the author of, or involved with, that library. I did test it with SDXL and it worked without problems; I didn't test it with Flux because that came later, I almost never use Flux except for tests, and I'm not a fan of using long prompts with the CLIP models.

I see you opened an issue with the author, and you'll probably get a better solution there than here, since we can only help with diffusers and the correct use of the models. That means we can't help with using more tokens than the model supports (because it's hacky and difficult to test) or with external libraries. It's not that we don't want to, but we really don't have the spare time for it.

P.S.: The warning you're getting is not from diffusers, and it's not the same error. My guess is that for the last batch of tokens the library is not correctly concatenating the tokens.
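For context, the general technique these long-prompt helpers use is to split the token ids into 77-token windows, encode each window with the text encoder, and concatenate the resulting embeddings along the sequence dimension. A rough sketch of that idea with plain transformers (not sd_embed's actual implementation):

```python
# Rough sketch of the chunk-and-concatenate idea behind long-prompt helpers.
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

prompt = "a very long, detailed prompt that goes well past the 77-token CLIP limit"

# Tokenize without truncation so no tokens are lost.
ids = tokenizer(prompt, truncation=False, add_special_tokens=False).input_ids

bos, eos = tokenizer.bos_token_id, tokenizer.eos_token_id
chunk_size = tokenizer.model_max_length - 2  # leave room for BOS/EOS per window

embeds = []
for start in range(0, len(ids), chunk_size):
    chunk = [bos] + ids[start:start + chunk_size] + [eos]
    # Pad the last (shorter) window so every window has the same length.
    chunk = chunk + [eos] * (tokenizer.model_max_length - len(chunk))
    with torch.no_grad():
        out = text_encoder(torch.tensor([chunk]))
    embeds.append(out.last_hidden_state)

# Concatenate along the sequence dimension: (1, n_windows * 77, hidden_dim).
prompt_embeds = torch.cat(embeds, dim=1)
print(prompt_embeds.shape)
```

If the padding or concatenation of that last, shorter window is handled incorrectly, you get exactly the kind of mismatch described above.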
Thank you @asomoza for looping me in. @nitinmukesh you can safely ignore the warning "Token indices sequence length is longer than the specified maximum sequence length for this model". It is emitted by the tokenizer; all of your tokens will actually be used in the embedding process.
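A quick way to see that the warning comes from the tokenizer alone (the model name here is just an example):

```python
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

long_prompt = " ".join(["photo of a cat"] * 40)  # well over 77 tokens

# This call logs the "Token indices sequence length is longer than the
# specified maximum sequence length for this model" warning ...
ids = tokenizer(long_prompt, truncation=False).input_ids

# ... but every token id is still returned; nothing was truncated here.
print(len(ids))
```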