
Can't get example working in "NLP support with Huggingface tokenizers" #2507

Answered by frankfliu
nezda asked this question in Q&A

You do need to specify a TranslatorFactory if your model is in a zip file. You can either:

  1. use ai.djl.huggingface.translator.QuestionAnsweringTranslatorFactory directly (see the sketch after the example below), or
  2. use DeferredTranslatorFactory, which will read the translator configuration from the serving.properties file inside the zip:
import java.nio.file.Paths;
import ai.djl.modality.nlp.qa.QAInput;
import ai.djl.repository.zoo.Criteria;
import ai.djl.training.util.ProgressBar;
import ai.djl.translate.DeferredTranslatorFactory;

Criteria<QAInput, String> criteria = Criteria.builder()
    .setTypes(QAInput.class, String.class)
    .optModelPath(Paths.get("model/nlp/question_answer/ai/djl/huggingface/pytorch/deepset/bert-base-cased-squad2/0.0.1/bert-base-cased-squad2.zip"))
    .optTranslatorFactory(new DeferredTranslatorFactory())
    .optProgress(new ProgressBar()).build();
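
For option 2, DeferredTranslatorFactory resolves the real factory from the serving.properties packaged inside the model zip. A minimal sketch of what such a file can contain (the engine entry and exact keys are assumptions and vary by model):

engine=PyTorch
translatorFactory=ai.djl.huggingface.translator.QuestionAnsweringTranslatorFactory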
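
For option 1, here is a minimal sketch of wiring the factory in explicitly and running a prediction with it. It assumes the Hugging Face tokenizers extension (the ai.djl.huggingface:tokenizers artifact) is on the classpath; the question/paragraph strings are made-up example inputs, and the snippet assumes the surrounding method declares the checked exceptions:

import java.nio.file.Paths;
import ai.djl.huggingface.translator.QuestionAnsweringTranslatorFactory;
import ai.djl.inference.Predictor;
import ai.djl.modality.nlp.qa.QAInput;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

Criteria<QAInput, String> criteria = Criteria.builder()
    .setTypes(QAInput.class, String.class)
    .optModelPath(Paths.get("model/nlp/question_answer/ai/djl/huggingface/pytorch/deepset/bert-base-cased-squad2/0.0.1/bert-base-cased-squad2.zip"))
    // wire the QA factory in explicitly instead of deferring to serving.properties
    .optTranslatorFactory(new QuestionAnsweringTranslatorFactory())
    .build();

try (ZooModel<QAInput, String> model = criteria.loadModel();
     Predictor<QAInput, String> predictor = model.newPredictor()) {
    QAInput input = new QAInput("Where is DJL developed?", "DJL is developed by Amazon.");
    String answer = predictor.predict(input);
    System.out.println(answer);
}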

Answer selected by nezda