diff --git a/docs/en/stack/ml/nlp/ml-nlp-shared.asciidoc b/docs/en/stack/ml/nlp/ml-nlp-shared.asciidoc
index da0f9a683..0568cda26 100644
--- a/docs/en/stack/ml/nlp/ml-nlp-shared.asciidoc
+++ b/docs/en/stack/ml/nlp/ml-nlp-shared.asciidoc
@@ -1,16 +1,13 @@
 tag::nlp-eland-clone-docker-build[]
-You can use the {eland-docs}[Eland client] to install the {nlp} model. Eland
-commands can be run in Docker. First, you need to clone the Eland repository
-then create a Docker image of Eland:
+You can use the {eland-docs}[Eland client] to install the {nlp} model. Use the prebuilt
+Docker image to run the Eland install model commands. Pull the latest image with:
 
 [source,shell]
 --------------------------------------------------
-git clone git@github.com:elastic/eland.git
-cd eland
-docker build -t elastic/eland .
+docker pull docker.elastic.co/eland/eland
 --------------------------------------------------
 
-After the script finishes, your Eland Docker client is ready to use.
+After the pull completes, your Eland Docker client is ready to use.
 end::nlp-eland-clone-docker-build[]
 
 tag::nlp-requirements[]
diff --git a/docs/en/stack/ml/nlp/ml-nlp-text-emb-vector-search-example.asciidoc b/docs/en/stack/ml/nlp/ml-nlp-text-emb-vector-search-example.asciidoc
index c4c04609f..01b73b2ca 100644
--- a/docs/en/stack/ml/nlp/ml-nlp-text-emb-vector-search-example.asciidoc
+++ b/docs/en/stack/ml/nlp/ml-nlp-text-emb-vector-search-example.asciidoc
@@ -19,6 +19,11 @@ consists of real questions from the Microsoft Bing search engine and human
 generated answers for them. The example works with a sample of this data set,
 uses a model to produce text embeddings, and then runs vector search on it.
 
+You can find
+https://github.com/elastic/elasticsearch-labs/blob/main/notebooks/integrations/hugging-face/loading-model-from-hugging-face.ipynb[this example as a Jupyter notebook]
+using the Python client in the `elasticsearch-labs` repo.
+
+
 [discrete]
 [[ex-te-vs-requirements]]
 == Requirements
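For context on the workflow this diff documents, the pulled image can run Eland's `eland_import_hub_model` entry point directly. The sketch below is illustrative only; the cluster URL, credentials, and Hugging Face model ID are placeholder assumptions, not values taken from this change:

[source,shell]
--------------------------------------------------
# Sketch: import a text embedding model with the prebuilt Eland image.
# <es-host> and <password> are placeholders; the --hub-model-id value is
# just one example text embedding model from the Hugging Face hub.
docker run -it --rm docker.elastic.co/eland/eland \
    eland_import_hub_model \
      --url https://elastic:<password>@<es-host>:9200/ \
      --hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
      --task-type text_embedding \
      --start
--------------------------------------------------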