Commit 2be2981

[Examples] Fixed the prompt and index name in the `llama-index-weaviate` example

peterschmidt85 committed Sep 20, 2023
1 parent cd0d230 commit 2be2981
Showing 1 changed file with 25 additions and 5 deletions.

docs/examples/llama-index-weaviate.md (25 additions, 5 deletions)

````diff
@@ -175,10 +175,10 @@ Once `llama_index.VectorStoreIndex` is ready, we can proceed with querying it.
 If we're deploying Llama 2, we have to ensure that the prompt format is correct.
 
 ```python
-from llama_index import QuestionAnswerPrompt
+from llama_index import (QuestionAnswerPrompt, RefinePrompt)
 
-prompt = QuestionAnswerPrompt(
-    """<s>[INST] <<SYS>>
+text_qa_template = QuestionAnswerPrompt(
+    """<s>[INST] <<SYS>>
 We have provided context information below.
 {context_str}
@@ -187,13 +187,33 @@ Given this information, please answer the question.
 <</SYS>>
 {query_str} [/INST]"""
 )
+
+refine_template = RefinePrompt(
+    """<s>[INST] <<SYS>>
+The original query is as follows:
+{query_str}
+We have provided an existing answer:
+{existing_answer}
+We have the opportunity to refine the existing answer (only if needed) with some more context below.
+{context_msg}
+<</SYS>>
+Given the new context, refine the original answer to better answer the query. If the context isn't useful, return the original answer. [/INST]"""
+)
 
 query_engine = index.as_query_engine(
-    text_qa_template=prompt,
+    text_qa_template=text_qa_template,
+    refine_template=refine_template,
     streaming=True,
 )
 
-response = query_engine.query("What did the author do growing up?")
+response = query_engine.query("Make a bullet-point timeline of the author's biography?")
 response.print_response_stream()
 ```
````
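
For reference, the query snippet being changed above expects an `index` that is already backed by Weaviate. Below is a minimal sketch of how such an index could be built, assuming the `llama_index` API of that period (circa 0.8.x) and the v3 `weaviate-client`; the Weaviate URL, the `index_name`, and the `data` directory are illustrative placeholders, not values taken from this commit:

```python
import weaviate
from llama_index import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores import WeaviateVectorStore

# Connect to a running Weaviate instance (the URL is a placeholder).
client = weaviate.Client("http://localhost:8080")

# Weaviate class names must be capitalized; "BlogPost" is an illustrative index name.
vector_store = WeaviateVectorStore(weaviate_client=client, index_name="BlogPost")
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Load local documents and persist their embeddings into Weaviate.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
```

Once `index` exists, the `as_query_engine(...)` call shown in the diff can be used as-is.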

