Improve error message for when an LLM base model can't be loaded. (#3675)
justinxzhao authored Sep 28, 2023
1 parent 07570ca commit e686abb
Showing 1 changed file with 5 additions and 3 deletions.
ludwig/schema/llms/base_model.py: 5 additions & 3 deletions
@@ -59,9 +59,11 @@ def validate(model_name: str):
             return model_name
         except OSError:
             raise ConfigValidationError(
-                f"Specified base model `{model_name}` is not a valid pretrained CausalLM listed on huggingface "
-                "or a valid local directory containing the weights for a pretrained CausalLM from huggingface."
-                "Please see: https://huggingface.co/models?pipeline_tag=text-generation&sort=downloads"
+                f"Specified base model `{model_name}` could not be loaded. If this is a private repository, make "
+                f"sure to set HUGGING_FACE_HUB_TOKEN in your environment. Check that {model_name} is a valid "
+                "pretrained CausalLM listed on huggingface or a valid local directory containing the weights for a "
+                "pretrained CausalLM from huggingface. See: "
+                "https://huggingface.co/models?pipeline_tag=text-generation&sort=downloads for a full list."
             )
     raise ValidationError(
         f"`base_model` should be a string, instead given: {model_name}. This can be a preset or any pretrained "
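For readers who want to reproduce the behavior this commit touches, below is a minimal, self-contained sketch of a validator following the same pattern as the hunk above. It is an illustration under assumptions, not Ludwig's actual implementation: the diff does not show how the model is probed, so the sketch assumes transformers.AutoConfig.from_pretrained as the loading call (which raises OSError for unknown repo ids, bad local paths, and private or gated repos without a token) and uses a stand-in ConfigValidationError class. The preset handling and the non-string ValidationError branch visible below the hunk are omitted.

# Sketch only: mirrors the except-OSError pattern in the hunk above.
# Assumptions (not shown in this diff): the model is probed with
# transformers.AutoConfig.from_pretrained, and ConfigValidationError here is a
# local stand-in for Ludwig's exception class.
from transformers import AutoConfig


class ConfigValidationError(Exception):
    """Stand-in for Ludwig's ConfigValidationError."""


def validate(model_name: str) -> str:
    try:
        # AutoConfig.from_pretrained raises OSError when the repo id or local
        # directory cannot be resolved (including gated/private repos when no
        # HUGGING_FACE_HUB_TOKEN is available).
        AutoConfig.from_pretrained(model_name)
        return model_name
    except OSError:
        raise ConfigValidationError(
            f"Specified base model `{model_name}` could not be loaded. If this is a private repository, make "
            f"sure to set HUGGING_FACE_HUB_TOKEN in your environment. Check that {model_name} is a valid "
            "pretrained CausalLM listed on huggingface or a valid local directory containing the weights for a "
            "pretrained CausalLM from huggingface. See: "
            "https://huggingface.co/models?pipeline_tag=text-generation&sort=downloads for a full list."
        )


if __name__ == "__main__":
    print(validate("gpt2"))  # resolves against the Hugging Face Hub (needs network access)
    try:
        validate("not-a-real/model-id")
    except ConfigValidationError as err:
        print(err)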
