!!! Exception during processing !!! 'tokenizers.AddedToken' object has no attribute 'special' #5561

Open
npubg opened this issue Nov 10, 2024 · 1 comment
Labels: User Support (a user needs help with something, probably not a bug)

Comments

npubg commented Nov 10, 2024

Your question

Error while loading a checkpoint: !!! Exception during processing !!! 'tokenizers.AddedToken' object has no attribute 'special'

Logs

got prompt
model weight dtype torch.float16, manual cast: None
model_type EPS
Using pytorch attention in VAE
Using pytorch attention in VAE
Ignored unknown kwarg option special
Ignored unknown kwarg option special
Ignored unknown kwarg option special
!!! Exception during processing !!! 'tokenizers.AddedToken' object has no attribute 'special'
Traceback (most recent call last):
  File "D:\ComfyUI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\ComfyUI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\nodes.py", line 549, in load_checkpoint
    out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\comfy\sd.py", line 611, in load_checkpoint_guess_config
    out = load_state_dict_guess_config(sd, output_vae, output_clip, output_clipvision, embedding_directory, output_model, model_options, te_model_options=te_model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\comfy\sd.py", line 665, in load_state_dict_guess_config
    clip = CLIP(clip_target, embedding_directory=embedding_directory, tokenizer_data=clip_sd, parameters=parameters, model_options=te_model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\comfy\sd.py", line 93, in __init__
    self.tokenizer = tokenizer(embedding_directory=embedding_directory, tokenizer_data=tokenizer_data)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\comfy\sdxl_clip.py", line 26, in __init__
    self.clip_l = clip_l_tokenizer_class(embedding_directory=embedding_directory)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\ComfyUI\comfy\sd1_clip.py", line 417, in __init__
    self.tokenizer = tokenizer_class.from_pretrained(tokenizer_path)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\python_embeded\Lib\site-packages\transformers\tokenization_utils_base.py", line 2213, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\python_embeded\Lib\site-packages\transformers\tokenization_utils_base.py", line 2447, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\python_embeded\Lib\site-packages\transformers\models\clip\tokenization_clip.py", line 323, in __init__
    super().__init__(
  File "D:\ComfyUI\python_embeded\Lib\site-packages\transformers\tokenization_utils.py", line 439, in __init__
    self._add_tokens(
  File "D:\ComfyUI\python_embeded\Lib\site-packages\transformers\tokenization_utils.py", line 569, in _add_tokens
    if not token.special and token.normalized and getattr(self, "do_lower_case", False):
           ^^^^^^^^^^^^^
AttributeError: 'tokenizers.AddedToken' object has no attribute 'special'

Prompt executed in 1.43 seconds

Other

No response
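
The repeated "Ignored unknown kwarg option special" lines followed by the AttributeError suggest a version mismatch: the installed `tokenizers` package appears to predate the `special` attribute on `AddedToken` that this `transformers` release reads in `_add_tokens`. A minimal diagnostic sketch, assuming that mismatch is the cause (run it with the embedded Python, `D:\ComfyUI\python_embeded\python.exe`):

```python
# Diagnostic sketch (assumption: an outdated `tokenizers` wheel in the embedded
# Python environment is the cause; nothing here is taken from the log itself).
import tokenizers
import transformers
from tokenizers import AddedToken

print("transformers:", transformers.__version__)
print("tokenizers:", tokenizers.__version__)

# Recent transformers code reads `token.special` on every added token; older
# tokenizers builds expose AddedToken without that attribute, which reproduces
# the AttributeError shown in the traceback above.
tok = AddedToken("<|test|>")
print("AddedToken has 'special':", hasattr(tok, "special"))
```

If the last line prints False, upgrading the package inside the embedded environment (for example `python_embeded\python.exe -m pip install -U tokenizers transformers`) is the usual fix, though the exact compatible versions depend on the ComfyUI release in use.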

npubg added the User Support label on Nov 10, 2024
ltdrdata (Collaborator) commented Nov 10, 2024

What is your checkpoint model?
Where did you download it from?
