
Support Phi3.5's "longrope" RoPE scaling type to make Phi3.5 compatible #1792

Closed
BBC-Esq opened this issue Sep 29, 2024 · 0 comments

BBC-Esq commented Sep 29, 2024

Running the conversion command gives me the following error:

ERROR
Starting conversion for bfloat16 with command:
ct2-transformers-converter --model "D:/Scripts/bench_chat/models/Phi-3.5-mini-instruct" --output_dir "D:/Scripts/bench_chat/models\Phi-3.5-mini-instruct-ct2-bfloat16" --quantization bfloat16 --low_cpu_mem_usage --trust_remote_code --copy_files ".cache" "added_tokens.json" "CODE_OF_CONDUCT.md" "configuration_phi3.py" "generation_config.json" "LICENSE" "model.safetensors.index.json" "modeling_phi3.py" "NOTICE.md" "README.md" "SECURITY.md" "special_tokens_map.json" "tokenizer.json" "tokenizer.model" "tokenizer_config.json"
Command failed with return code 1: 
Loading checkpoint shards:   0%|          | 0/2 [00:00<?, ?it/s]
Loading checkpoint shards: 100%|##########| 2/2 [00:00<00:00, 23.81it/s]
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "D:\Scripts\bench_chat\Scripts\ct2-transformers-converter.exe\__main__.py", line 7, in <module>
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\transformers.py", line 2482, in main
    converter.convert_from_args(args)
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\converter.py", line 50, in convert_from_args
    return self.convert(
           ^^^^^^^^^^^^^
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\converter.py", line 89, in convert
    model_spec = self._load()
                 ^^^^^^^^^^^^
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\transformers.py", line 148, in _load
    spec = loader(model, tokenizer)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\transformers.py", line 200, in __call__
    spec = self.get_model_spec(model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Scripts\bench_chat\Lib\site-packages\ctranslate2\converters\transformers.py", line 1962, in get_model_spec
    raise NotImplementedError(
NotImplementedError: RoPE scaling type 'longrope' is not yet implemented. The following RoPE scaling types are currently supported: linear, su, llama3
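
For anyone hitting this before a proper fix lands: judging from the linked transformers code, "longrope" appears to be the new name for the scaling type that Phi-3 configs previously called "su", which the converter already supports. If that equivalence holds, a rough, untested workaround is to rename the scaling type in the local model's config.json before converting (back up the file first):

```python
# Rough, untested workaround: if "longrope" is just the renamed "su" scaling
# type, rewriting the local config.json may let the converter's existing
# "su" code path handle the model.
import json
from pathlib import Path

config_path = Path("D:/Scripts/bench_chat/models/Phi-3.5-mini-instruct/config.json")
config = json.loads(config_path.read_text())

rope_scaling = config.get("rope_scaling") or {}
if rope_scaling.get("type") == "longrope":
    rope_scaling["type"] = "su"  # name the converter already recognizes
    config_path.write_text(json.dumps(config, indent=2))
```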

Here are some references in case they help:

https://github.com/huggingface/transformers/blob/2e24ee4dfa39cc0bc264b89edbccc373c8337086/src/transformers/modeling_rope_utils.py#L242

https://github.com/huggingface/transformers/blob/2e24ee4dfa39cc0bc264b89edbccc373c8337086/src/transformers/models/phi3/modeling_phi3.py#L251

https://github.com/huggingface/transformers/blob/2e24ee4dfa39cc0bc264b89edbccc373c8337086/src/transformers/models/phi3/configuration_phi3.py#L79

https://arxiv.org/html/2402.13753v1
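
For reference, here is a minimal NumPy sketch of what "longrope" computes, paraphrased from the modeling_rope_utils.py linked above; the function and argument names are mine, not the transformers API. It rescales each rotary inverse frequency by a per-dimension factor (short_factor within the original context window, long_factor beyond it) and applies a global attention scaling term:

```python
# Minimal sketch of the "longrope" frequency computation, paraphrased from
# the linked transformers code and the LongRoPE paper. Not the exact API.
import math
import numpy as np

def longrope_inv_freq(head_dim, rope_theta, short_factor, long_factor,
                      original_max_pos, max_pos, seq_len):
    # short_factor / long_factor come from the model's rope_scaling config
    # and have one entry per rotary dimension (head_dim // 2 values each).
    factors = np.array(long_factor if seq_len > original_max_pos else short_factor)

    # Standard RoPE frequencies, divided by the per-dimension rescale factors.
    inv_freq = 1.0 / (factors * rope_theta ** (np.arange(0, head_dim, 2) / head_dim))

    # Global attention scaling compensates for the extended context.
    scale = max_pos / original_max_pos
    attn_factor = 1.0 if scale <= 1.0 else math.sqrt(
        1 + math.log(scale) / math.log(original_max_pos)
    )
    return inv_freq, attn_factor
```

If this matches the already-supported "su" path, the converter fix may be as small as accepting the new type name.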
