community: add PanelCallbackHandler #27039

Open · wants to merge 19 commits into master
259 changes: 259 additions & 0 deletions docs/docs/integrations/callbacks/panel.ipynb
@@ -0,0 +1,259 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Panel\n",
"\n",
"> [Panel](https://panel.holoviz.org/index.html) is an open-source Python library designed to streamline the development of robust tools, dashboards, and complex applications entirely within Python. With a comprehensive philosophy,\n",
"> Panel integrates seamlessly with the PyData ecosystem, offering powerful, interactive data tables, visualizations, and much more, to unlock, visualize, share, and collaborate on your data for efficient workflows.\n",
"\n",
"In this guide, we will go over how to set up the `PanelCallbackHandler`. The `PanelCallbackHandler` is useful for rendering and streaming the chain of thought from LangChain objects like Tools, Agents, and Chains. It inherits from LangChain’s `BaseCallbackHandler`.\n",
"\n",
"Check out the panel-chat-examples docs for more examples of how to use `PanelCallbackHandler`. If you have an example to demo, we’d love to add it to the panel-chat-examples gallery!\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Installation and Setup\n",
"\n",
"```bash\n",
"pip install langchain panel\n",
"```\n",
"\n",
"See full instructions in Panel's [Getting started documentation](https://panel.holoviz.org/getting_started/index.html).\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Basic chat with an LLM\n",
"\n",
"To get started:\n",
"\n",
"1. Define a chat callback, like `respond` here.\n",
"2. Pass the instance of a `ChatFeed` or `ChatInterface` to `PanelCallbackHandler`.\n",
"3. Pass the callback handler in a list to the `callbacks` argument when constructing or using LangChain objects like `ChatOpenAI` here."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import panel as pn\n",
"from langchain_community.callbacks import PanelCallbackHandler\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"pn.extension()\n",
"\n",
"\n",
"def respond(contents):\n",
" llm.invoke(contents)\n",
"\n",
"\n",
"chat_interface = pn.chat.ChatInterface(callback=respond)\n",
"callback = PanelCallbackHandler(chat_interface)\n",
"llm = ChatOpenAI(model_name=\"gpt-4o-mini\", streaming=True, callbacks=[callback])\n",
"chat_interface"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This example shows only the response from the LLM. An LLM by itself does not show any chain of thought. Later, we will build an agent that uses tools; that will show the chain of thought."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Async chat with an LLM\n",
"\n",
"Using `async` prevents blocking the main thread, enabling concurrent interactions with the app. This improves responsiveness and user experience.\n",
"\n",
"To do so:\n",
"\n",
"1. Prefix the function with `async`\n",
"2. Prefix the call with `await`\n",
"3. Use `ainvoke` instead of `invoke`"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import panel as pn\n",
"from langchain_community.callbacks import PanelCallbackHandler\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"pn.extension()\n",
"\n",
"\n",
"async def respond(contents):\n",
" await llm.ainvoke(contents)\n",
"\n",
"\n",
"chat_interface = pn.chat.ChatInterface(callback=respond)\n",
"callback = PanelCallbackHandler(chat_interface)\n",
"llm = ChatOpenAI(model_name=\"gpt-4o-mini\", streaming=True, callbacks=[callback])\n",
"chat_interface"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Agents with Tools\n",
"\n",
"Agents and tools can also be used. Simply pass the callback to the `AgentExecutor` and its `invoke` method."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import panel as pn\n",
"from langchain import hub\n",
"from langchain.agents import AgentExecutor, create_react_agent, load_tools\n",
"from langchain_community.callbacks import PanelCallbackHandler\n",
"from langchain_openai import ChatOpenAI\n",
"\n",
"pn.extension()\n",
"\n",
"\n",
"def respond(contents):\n",
" agent_executor.invoke({\"input\": contents}, {\"callbacks\": [callback]})\n",
"\n",
"\n",
"chat_interface = pn.chat.ChatInterface(callback=respond)\n",
"callback = PanelCallbackHandler(chat_interface)\n",
"llm = ChatOpenAI(model_name=\"gpt-4o-mini\", streaming=True, callbacks=[callback])\n",
"tools = load_tools([\"ddg-search\"])\n",
"prompt = hub.pull(\"hwchase17/react\")\n",
"agent = create_react_agent(llm, tools, prompt)\n",
"agent_executor = AgentExecutor(agent=agent, tools=tools, callbacks=[callback])\n",
"\n",
"chat_interface"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Chain with Retrievers\n",
"\n",
"Retrieval-augmented generation (RAG) is also possible; simply pass `callback` again. Then ask the chatbot what the secret number is!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from uuid import uuid4\n",
"\n",
"import panel as pn\n",
"from langchain.prompts import ChatPromptTemplate\n",
"from langchain.schema.runnable import RunnablePassthrough\n",
"from langchain.text_splitter import CharacterTextSplitter\n",
"from langchain_community.callbacks import PanelCallbackHandler\n",
"from langchain_community.vectorstores import Chroma\n",
"from langchain_openai import ChatOpenAI, OpenAIEmbeddings\n",
"\n",
"TEXT = \"The secret number is 888.\"\n",
"\n",
"TEMPLATE = \"\"\"Answer the question based only on the following context:\n",
"\n",
"{context}\n",
"\n",
"Question: {question}\n",
"\"\"\"\n",
"\n",
"pn.extension(design=\"material\")\n",
"\n",
"\n",
"@pn.cache\n",
"def get_vector_store():\n",
" text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100)\n",
" texts = text_splitter.split_text(TEXT)\n",
" embeddings = OpenAIEmbeddings()\n",
" db = Chroma.from_texts(texts, embeddings)\n",
" return db\n",
"\n",
"\n",
"def get_chain(callbacks):\n",
" retriever = db.as_retriever(callbacks=callbacks)\n",
" model = ChatOpenAI(callbacks=callbacks, streaming=True)\n",
"\n",
" def format_docs(docs):\n",
" text = \"\\n\\n\".join([d.page_content for d in docs])\n",
" return text\n",
"\n",
" def hack(docs):\n",
" # https://github.com/langchain-ai/langchain/issues/7290\n",
" for callback in callbacks:\n",
" callback.on_retriever_end(docs, run_id=uuid4())\n",
" return docs\n",
"\n",
" return (\n",
" {\"context\": retriever | hack | format_docs, \"question\": RunnablePassthrough()}\n",
" | prompt\n",
" | model\n",
" )\n",
"\n",
"\n",
"async def respond(contents):\n",
" chain = get_chain(callbacks=[callback])\n",
" await chain.ainvoke(contents)\n",
"\n",
"\n",
"db = get_vector_store()\n",
"prompt = ChatPromptTemplate.from_template(TEMPLATE)\n",
"chat_interface = pn.chat.ChatInterface(callback=respond)\n",
"callback = PanelCallbackHandler(chat_interface)\n",
"\n",
"chat_interface"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.8"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
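The streaming flow used throughout the notebook can be sketched without Panel or LangChain installed: the handler receives token events (as `BaseCallbackHandler.on_llm_new_token` does) and appends each token to the current chat message. A minimal sketch of that pattern follows; `FakeChatFeed` and `SketchCallbackHandler` are hypothetical stand-ins, not the library code.

```python
# Illustrative sketch only: FakeChatFeed stands in for Panel's
# ChatInterface, and SketchCallbackHandler mimics the token-streaming
# behavior of PanelCallbackHandler. Neither is real library code.


class FakeChatFeed:
    """Collects streamed messages, like a Panel chat feed would."""

    def __init__(self):
        self.messages = []

    def stream(self, token, message=None):
        # Append to the current message, or start a new one.
        if message is None:
            message = {"text": ""}
            self.messages.append(message)
        message["text"] += token
        return message


class SketchCallbackHandler:
    """Receives token events, like BaseCallbackHandler.on_llm_new_token."""

    def __init__(self, feed):
        self.feed = feed
        self._current = None

    def on_llm_start(self):
        self._current = None  # the next token starts a fresh message

    def on_llm_new_token(self, token):
        self._current = self.feed.stream(token, self._current)


feed = FakeChatFeed()
handler = SketchCallbackHandler(feed)
handler.on_llm_start()
for tok in ["The answer ", "is ", "42."]:
    handler.on_llm_new_token(tok)
print(feed.messages[0]["text"])  # prints "The answer is 42."
```

In the real handler, each streamed token updates the Panel chat widget in place, which is what produces the token-by-token rendering seen in the examples above.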
5 changes: 5 additions & 0 deletions libs/community/langchain_community/callbacks/__init__.py
@@ -60,6 +60,9 @@
from langchain_community.callbacks.openai_info import (
OpenAICallbackHandler,
)
from langchain_community.callbacks.panel_callback import (
PanelCallbackHandler,
)
from langchain_community.callbacks.promptlayer_callback import (
PromptLayerCallbackHandler,
)
@@ -105,6 +108,7 @@
"LabelStudioCallbackHandler": "langchain_community.callbacks.labelstudio_callback",
"MlflowCallbackHandler": "langchain_community.callbacks.mlflow_callback",
"OpenAICallbackHandler": "langchain_community.callbacks.openai_info",
"PanelCallbackHandler": "langchain_community.callbacks.panel_callback",
"PromptLayerCallbackHandler": "langchain_community.callbacks.promptlayer_callback",
"SageMakerCallbackHandler": "langchain_community.callbacks.sagemaker_callback",
"StreamlitCallbackHandler": "langchain_community.callbacks.streamlit",
@@ -143,6 +147,7 @@ def __getattr__(name: str) -> Any:
"LabelStudioCallbackHandler",
"MlflowCallbackHandler",
"OpenAICallbackHandler",
"PanelCallbackHandler",
"PromptLayerCallbackHandler",
"SageMakerCallbackHandler",
"StreamlitCallbackHandler",
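The `__init__.py` hunks above extend langchain_community's lazy-import table: a dict maps each exported name to the module path that defines it, and a module-level `__getattr__` (PEP 562) performs the import on first access. A rough sketch of the pattern, using stdlib modules as stand-ins for the real submodule paths:

```python
# Sketch of the lazy-import pattern the diff above extends (PEP 562).
# The lookup table uses stdlib modules as stand-ins for
# langchain_community's real submodule paths.
import importlib
import sys

_module_lookup = {
    "sqrt": "math",
    "OrderedDict": "collections",
}


def __getattr__(name):
    """Module-level hook: import the defining module on first access."""
    if name in _module_lookup:
        module = importlib.import_module(_module_lookup[name])
        return getattr(module, name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")


# Attribute access on the module object triggers __getattr__;
# the lookup and import happen only at this point, not at startup.
this_module = sys.modules[__name__]
print(this_module.sqrt(9.0))  # prints 3.0
```

This keeps `import langchain_community.callbacks` cheap: adding `PanelCallbackHandler` to the table costs nothing until someone actually accesses the name.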