
Commit

Missing tokenizer.model error during gguf conversion (ggerganov#6443)
Co-authored-by: Jared Van Bortel <[email protected]>
overtunned and cebtenzzre authored Apr 3, 2024
1 parent 1ff4d9f commit db214fa
Showing 1 changed file with 1 addition and 2 deletions.
convert-hf-to-gguf.py (3 changes: 1 addition & 2 deletions)
@@ -323,8 +323,7 @@ def _set_vocab_sentencepiece(self):
         toktypes: list[int] = []

         if not tokenizer_path.is_file():
-            print(f'Error: Missing {tokenizer_path}', file=sys.stderr)
-            sys.exit(1)
+            raise FileNotFoundError(f"File not found: {tokenizer_path}")

         tokenizer = SentencePieceProcessor(str(tokenizer_path))
         vocab_size = self.hparams.get('vocab_size', tokenizer.vocab_size())
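
The change replaces a print-and-exit with a raised FileNotFoundError, so a missing tokenizer.model surfaces as an ordinary Python exception with a traceback and can be caught by calling code instead of terminating the process. The snippet below is a minimal, self-contained sketch of that pattern; load_tokenizer and the model directory path are hypothetical names used only for illustration, not part of convert-hf-to-gguf.py.

    # Sketch of the raise-instead-of-exit pattern; names are hypothetical.
    from pathlib import Path


    def load_tokenizer(model_dir: str) -> Path:
        tokenizer_path = Path(model_dir) / "tokenizer.model"
        if not tokenizer_path.is_file():
            # Raising lets callers decide how to handle the failure, and the
            # traceback points at the check, unlike print() + sys.exit(1).
            raise FileNotFoundError(f"File not found: {tokenizer_path}")
        return tokenizer_path


    if __name__ == "__main__":
        try:
            load_tokenizer("/path/to/model")
        except FileNotFoundError as exc:
            print(f"Conversion aborted: {exc}")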
