Hotfix: Update "how to use" api docs (#4)
* add usage section

* change md to mdx

* add usage section redirect

* update general readme

* fix learn body json
BorjaZarco authored Jul 21, 2023
1 parent e5c3092 commit 5b235da
Showing 5 changed files with 127 additions and 81 deletions.
17 changes: 12 additions & 5 deletions README.md
@@ -26,10 +26,13 @@ Its objective is to create an ecosystem of specific and finished components that

## Table of Contents

- [Why not use other tools (Langchain, ChromaDB, Pinecone, etc.)?](#why-not-use-other-tools-of-the-market-langchain-chromadb-pinecone-etc)
- [Table of Contents](#table-of-contents)
- [Why not use other tools (Langchain, ChromaDB, Pinecone, etc.)?](#why-not-use-other-tools-langchain-chromadb-pinecone-etc)
- [What can I do with eLLMental now?](#what-can-i-do-with-ellmental-now)
- [What can I do with a Semantic Search service?](#what-can-i-do-with-a-semantic-search-service)
- [What can I do with a semantic search service?](#what-can-i-do-with-a-semantic-search-service)
- [Quick Start Guide](#quick-start-guide)
- [Running the service](#running-the-service)
- [Using the service](#using-the-service)
- [Contributing and Community](#contributing-and-community)
- [License](#license)
- [Contact](#contact)
@@ -75,6 +78,8 @@ You can use a semantic search for a huge variety of use cases. These are some of
In this guide, you'll have your very own instance of a semantic search service up and running on your computer,
powered by OpenAI embeddings and the practicality of a local database. The whole process takes less than 5 minutes! 🌈

### Running the service

To begin working locally with **eLLMental**, you first need to run the `quickstart.sh` script, which starts
a local Docker instance under the hood. With that in mind, here's a summary of the steps you'll need to follow:

@@ -88,11 +93,13 @@ just running: `./quickstart.sh`
> This script will ask you for your OpenAI API key, and then it will start the service using
> Docker.
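
If you want to confirm the container is ready before sending requests, a small readiness check works well. The snippet below is only an illustrative sketch (it is not part of the repository) and assumes the service listens on `http://localhost:8000` as described in the next section:

```python
# Illustrative readiness check: poll the documentation endpoint until the
# service answers, or give up after roughly 30 seconds.
import time

import requests


def wait_for_service(url="http://localhost:8000/docs", timeout=30.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if requests.get(url, timeout=2).ok:
                return True
        except requests.RequestException:
            pass  # not reachable yet, keep polling
        time.sleep(1)
    return False


if __name__ == "__main__":
    print("service is up" if wait_for_service() else "service did not start in time")
```
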
### Using the service

Once the Docker container is running, you will have a semantic search service exposing three endpoints:

1. The documentation endpoint, with information about how to use the API: `http://localhost:8000/docs`
2. The learn endpoint: `http://localhost:8000/learn`
3. The search endpoint: `http://localhost:8000/search`
1. The documentation endpoint, with information about how to use the API: [http://localhost:8000/docs](http://localhost:8000/docs)
2. The learn endpoint: [http://localhost:8000/learn](http://localhost:8000/learn)
3. The search endpoint: [http://localhost:8000/search](http://localhost:8000/search)

On our [documentation site](https://python.ellmental.com) you will find more information about the capabilities of the service, such as how to use Azure OpenAI to generate the embeddings or how to use your own database.
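
As a quick smoke test, the whole learn-then-search flow can be driven from a few lines of Python. This is only a sketch using the third-party `requests` library, and it assumes the request body formats shown in the usage docs added by this commit:

```python
# End-to-end sketch (not shipped with the repo): teach the service one sentence,
# then query it. Assumes the quickstart service is listening on localhost:8000.
import requests

BASE_URL = "http://localhost:8000"

# 1. Store a piece of content under a cluster id of our choosing.
learn_body = {
    "items": [
        {
            "content": {"text": "Grain in dogs' food is not good for them."},
            "metadata": {"key1": "value1", "key2": "value2"},
            "cluster_id": "your_file_id",
        }
    ]
}
requests.post(f"{BASE_URL}/learn", json=learn_body).raise_for_status()

# 2. Ask a question scoped to that cluster and print what comes back.
search_body = {
    "query": "How does grain food affect dogs?",
    "cluster_ids": ["your_file_id"],
}
response = requests.post(f"{BASE_URL}/search", json=search_body)
response.raise_for_status()
print(response.json())
```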

74 changes: 4 additions & 70 deletions website/docs/02_quickstart.md
@@ -9,6 +9,8 @@ sidebar_label: Quickstart
In this guide, you'll have your very own instance of a semantic search service up and running on your computer,
powered by OpenAI embeddings and the practicality of a local database. The whole process takes less than 5 minutes! 🌈

## Running the service

To begin working locally with **eLLMental**, you first need to run the `quickstart.sh` script, which starts
a local Docker instance under the hood. With that in mind, here's a summary of the steps you'll need to follow:

@@ -22,74 +24,6 @@ just running: `./quickstart.sh`
> This script will ask you for your OpenAI API key, and then it will start the service using
> Docker.
Once the Docker container is running, you will have a semantic search service exposing three endpoints:

1. The documentation endpoint, with information about how to use the API: `http://localhost:8000/docs`
2. [The learn endpoint](#learn-endpoint): `http://localhost:8000/learn`
3. [The search endpoint](#search-endpoint): `http://localhost:8000/search`

## Learn endpoint

With the `learn` endpoint, you can make your service learn from a set of documents' content (or any text you'd like!). You
just need to provide the endpoint with `content`, so it can then be turned into embeddings and stored in the local database.

You can try the service by sending a POST request to `http://localhost:8000/learn` with the following body:

```json
{
"items": [
{
"content": "Grain in dogs' food is not good for them.",
"metadata": {"key1": "value1", "key2": "value2"},
"cluster_id": "your_file_id"
},
...
]
}
```

---

**CURL command:**

```bash
curl --location 'http://127.0.0.1:8000/learn' \
--header 'Content-Type: application/json' \
--data '{
"items": [
{
"content": "Grain in dogs'\'' food is not good for them.",
"metadata": {"key1": "value1", "key2": "value2"},
"cluster_id": "your_file_id"
}
]
}'
```

## Search endpoint

With the search endpoint, you will get answers to your questions, based on what you have ingested with
the `learn` endpoint. You just need to provide the endpoint with a `query`, and the service will return the most
relevant information based on the embeddings stored in the local database.

You can try this endpoint by sending a POST request to `http://localhost:8000/search` with the following body:

```json
{
"query": "How does grain food affect dogs?",
"cluster_ids": ["your_file_id"]
}
```

---

**CURL command:**
## Using the service

```bash
curl --location 'http://127.0.0.1:8000/search' \
--header 'Content-Type: application/json' \
--data '{
"query": "How does grain food affect dogs?",
"cluster_ids": ["your_file_id"]
}'
```
Once the service is up and running, you can start using it by sending requests to the different endpoints. You can find more information about the endpoints in the [Usage](./03_semantic_search/033_semantic_search_usage.mdx) section.
100 changes: 100 additions & 0 deletions website/docs/03_semantic_search/033_semantic_search_usage.mdx
@@ -0,0 +1,100 @@
---
slug: /semantic-search/usage
title: Usage
sidebar_label: Usage
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

The Semantic Search service is used through a REST API: you consume it by sending HTTP requests to the different service endpoints. There is an endpoint for every action available in the service.

## OpenAPI documentation

The REST API has been developed following the OpenAPI specification, so you can rely on the generated OpenAPI documentation to learn how to use the API.

In fact, the Semantic Search service provides an endpoint that serves this OpenAPI documentation. You can find it at [http://localhost:8000/docs](http://localhost:8000/docs).

This endpoint serves a [Swagger UI](https://swagger.io/) page with all the information about the API. You can also try the API from this page, as it lets you send requests to the different endpoints directly.

Additionally, you can get the raw `openapi.json` file at [http://localhost:8000/openapi.json](http://localhost:8000/openapi.json).
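
If you prefer to inspect the spec programmatically, the file can be fetched like any other JSON document. The following is a small sketch (using the third-party `requests` library, which the service does not ship) that downloads the spec and lists the available paths:

```python
# Illustrative sketch: download the OpenAPI spec and list the documented paths.
import requests

spec = requests.get("http://localhost:8000/openapi.json", timeout=5).json()

print(spec.get("info", {}).get("title", "<no title>"))
for path, operations in spec.get("paths", {}).items():
    # Each path maps HTTP methods (get, post, ...) to operation objects.
    print(path, "->", ", ".join(method.upper() for method in operations))
```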

## Available endpoints

The Semantic Search service provides two main endpoints: `learn` and `search`. You can find more information about them in the following sections.

### Learn endpoint

With the `learn` endpoint, you can make your service learn from a set of documents' content (or any text you'd like!). You
just need to provide the endpoint with `content`, so it can then be turned into embeddings and stored in the local database.

You can try the service by sending a `POST` request to [http://localhost:8000/learn](http://localhost:8000/learn) with the following body:

<Tabs groupId="api-request">
<TabItem value="json" label="JSON Body" default>

```json
{
"items": [
{
"content": { "text": "Grain in dogs food is not good for them."},
"metadata": {"key1": "value1", "key2": "value2"},
"cluster_id": "your_file_id"
}
]
}
```

</TabItem>
<TabItem value="curl" label="CURL Request">

```bash
curl --location 'http://127.0.0.1:8000/learn' \
--header 'Content-Type: application/json' \
--data '{
"items": [
{
"content": { "text": "Grain in dogs food is not good for them."},
"metadata": {"key1": "value1", "key2": "value2"},
"cluster_id": "your_file_id"
}
]
}'
```

</TabItem>
</Tabs>
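
The same request can also be sent from code. Here is a brief sketch using Python and the third-party `requests` library (an illustration on our part, not a client shipped with the service):

```python
# Sketch of the learn request from Python.
import requests

learn_body = {
    "items": [
        {
            "content": {"text": "Grain in dogs' food is not good for them."},
            "metadata": {"key1": "value1", "key2": "value2"},
            "cluster_id": "your_file_id",
        }
    ]
}

response = requests.post("http://127.0.0.1:8000/learn", json=learn_body, timeout=10)
response.raise_for_status()  # fail loudly if the service rejects the request
print(response.status_code)
```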


### Search endpoint

With the `search` endpoint, you will get answers to your questions based on what you have ingested with
the `learn` endpoint. You just need to provide the endpoint with a `query`, and the service will return the most
relevant information based on the embeddings stored in the local database.

You can try this endpoint by sending a `POST` request to [http://localhost:8000/search](http://localhost:8000/search) with the following body:

<Tabs groupId="api-request">
<TabItem value="json" label="JSON Body" default>

```json
{
"query": "How does grain food affect dogs?",
"cluster_ids": ["your_file_id"]
}
```

</TabItem>
<TabItem value="curl" label="CURL Request">

```bash
curl --location 'http://127.0.0.1:8000/search' \
--header 'Content-Type: application/json' \
--data '{
"query": "How does grain food affect dogs?",
"cluster_ids": ["your_file_id"]
}'
```

</TabItem>
</Tabs>
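
As with the `learn` endpoint, the search request can be issued from code. The sketch below uses Python's third-party `requests` library and the same body as the examples above:

```python
# Sketch of the search request from Python.
import requests

search_body = {
    "query": "How does grain food affect dogs?",
    "cluster_ids": ["your_file_id"],
}

response = requests.post("http://127.0.0.1:8000/search", json=search_body, timeout=10)
response.raise_for_status()
print(response.json())  # the most relevant matches returned by the service
```
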
17 changes: 11 additions & 6 deletions website/sidebars.js
@@ -18,29 +18,34 @@ const sidebars = {
    'quickstart',
    {
      type: 'category',
      label: 'Semantic search',
      label: 'Semantic Search',
      collapsed: false,
      link: {
        type: 'doc',
        id: 'semantic_search/semantic_search',
      },
      items: ['semantic_search/getting_started', 'semantic_search/embedding_generators', 'semantic_search/embedding_stores'],
      items: [
        'semantic_search/getting_started',
        'semantic_search/semantic_search_usage',
        'semantic_search/embedding_generators',
        'semantic_search/embedding_stores',
      ],
    },
    {
      type: 'category',
      label: 'Community',
      link: {
        type: 'doc',
        id: 'community/index'
        id: 'community/index',
      },
      items: [
        'community/contributing',
        {
          type: 'doc',
          id: 'community/code_of_conduct',
          label: 'Code of Conduct'
        }
      ]
          label: 'Code of Conduct',
        },
      ],
    },
  ],
};
