Update guide for gr.load_chat and allow **kwargs (#10331)
* changes

* changes

* format

* add changeset

* add changeset

* docs

---------

Co-authored-by: gradio-pr-bot <[email protected]>
abidlabs and gradio-pr-bot authored Jan 10, 2025
1 parent 343503d commit decb594
Showing 3 changed files with 16 additions and 7 deletions.
5 changes: 5 additions & 0 deletions .changeset/some-cases-notice.md
@@ -0,0 +1,5 @@
+---
+"gradio": patch
+---
+
+fix:Update guide for `gr.load_chat` and allow `**kwargs`
14 changes: 9 additions & 5 deletions gradio/external.py
@@ -592,15 +592,17 @@ def load_chat(
     *,
     system_message: str | None = None,
     streaming: bool = True,
+    **kwargs,
 ) -> ChatInterface:
     """
     Load a chat interface from an OpenAI API chat compatible endpoint.
     Parameters:
-        base_url: The base URL of the endpoint.
-        model: The model name.
-        token: The API token.
-        system_message: The system message for the conversation, if any.
+        base_url: The base URL of the endpoint, e.g. "http://localhost:11434/v1/"
+        model: The name of the model you are loading, e.g. "llama3.2"
+        token: The API token or a placeholder string if you are using a local model, e.g. "ollama"
+        system_message: The system message to use for the conversation, if any.
         streaming: Whether the response should be streamed.
+        kwargs: Additional keyword arguments to pass into ChatInterface for customization.
     """
     try:
         from openai import OpenAI
@@ -645,4 +647,6 @@ def open_api_stream(
                 response += chunk.choices[0].delta.content
                 yield response
 
-    return ChatInterface(open_api_stream if streaming else open_api, type="messages")
+    return ChatInterface(
+        open_api_stream if streaming else open_api, type="messages", **kwargs
+    )
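Because any extra keyword arguments are now forwarded to `ChatInterface`, standard ChatInterface options can be set directly from `gr.load_chat`. A minimal sketch, assuming a local Ollama server at `http://localhost:11434/v1/` and that `title` and `examples` are among the `ChatInterface` parameters you want to customize:

```python
import gradio as gr

# Extra keyword arguments (here: title and examples) are passed
# through to the underlying gr.ChatInterface for customization.
demo = gr.load_chat(
    "http://localhost:11434/v1/",  # assumed local Ollama endpoint
    model="llama3.2",
    token="ollama",
    title="Local Llama Chat",
    examples=["What is Gradio?", "Write a haiku about Python."],
)

demo.launch()
```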
4 changes: 2 additions & 2 deletions guides/05_chatbots/01_creating-a-chatbot-fast.md
@@ -21,10 +21,10 @@ If you have a chat server serving an OpenAI-API compatible endpoint (e.g. Ollama
 ```python
 import gradio as gr
 
-gr.load_chat("http://localhost:11434/v1/", model="llama3.2", token=None).launch()
+gr.load_chat("http://localhost:11434/v1/", model="llama3.2", token="ollama").launch()
 ```
 
-If not, don't worry, keep reading to see how to create an application around any chat model!
+If you have your own model, keep reading to see how to create an application around any chat model in Python!
 
 ## Defining a chat function
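The updated docstring also documents the `system_message` and `streaming` parameters. A small variation on the guide example, again assuming a local Ollama server, that sets a system prompt and disables streaming:

```python
import gradio as gr

# system_message seeds the conversation; streaming=False returns
# the full completion at once instead of streaming tokens.
gr.load_chat(
    "http://localhost:11434/v1/",
    model="llama3.2",
    token="ollama",
    system_message="You are a concise assistant.",
    streaming=False,
).launch()
```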
