Documentation available here
OpenCopilot allows you to have your own product's AI copilot. It integrates with your underlying APIs and is able to execute API calls whenever needed. It uses LLMs to determine if the user's request requires calling an API endpoint. Then, it decides which endpoint to call and passes the appropriate payload based on the given API definition.
- Provide your API/backend definition, including your public endpoints and how to call them. Currently, OpenCopilot supports Swagger OpenAPI 3.0. We're also working on a UI to allow you to dynamically add endpoints.
- OpenCopilot validates your schema to achieve the best results.
- We feed the API definition to an LLM.
- Finally, you can integrate our user-friendly chat bubble into your SaaS app.
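For illustration, here is a minimal sketch of the kind of OpenAPI 3.0 definition you would provide, written as a TypeScript object literal so it stays consistent with the other sketches in this README. The `/cases` endpoint and its fields are hypothetical examples, not part of OpenCopilot; in practice you would supply the equivalent JSON or YAML file.

```ts
// A minimal, hypothetical OpenAPI 3.0 definition sketch. The "/cases" endpoint
// and its fields are illustrative only; supply your own product's endpoints.
const openApiDefinition = {
  openapi: "3.0.0",
  info: { title: "Example SaaS API", version: "1.0.0" },
  paths: {
    "/cases": {
      post: {
        operationId: "createCase",
        summary: "Create a new support case",
        requestBody: {
          required: true,
          content: {
            "application/json": {
              schema: {
                type: "object",
                properties: {
                  title: { type: "string", description: "Short case title" },
                  description: { type: "string", description: "Details of the problem" },
                },
                required: ["title"],
              },
            },
          },
        },
        responses: { "201": { description: "Case created" } },
      },
    },
  },
};

export default openApiDefinition;
```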
Make sure you have Docker installed.
To begin, clone this Git repository:
git clone git@github.com:openchatai/OpenCopilot.git
- Update the `.env` file located in the `llm-server` directory with your `OPENAI_API_KEY`. You can use the `.env.example` file as a reference:

OPENAI_API_KEY=YOUR_TOKEN_HERE
You can also choose which OpenAI model the copilot uses by setting PLAN_AND_EXECUTE_MODEL:
- gpt-4: Ideal for more complex tasks, but may have slower processing times.
- gpt-3.5-turbo-16k: Significantly faster than gpt-4.

PLAN_AND_EXECUTE_MODEL=gpt-3.5-turbo-16k
- After updating your API key, navigate to the repository folder and run the following command (for macOS or Linux):
make install
This will install the necessary dependencies and set up the environment for the OpenCopilot project.
Once the installation is complete, you can access the OpenCopilot console at: http://localhost:8888
If needed, you can also restart the local setup using:
make restart
You can also see the complete list of available commands using
make help
You can try it out on opencopilot.so
Watch this video from Shopify:
(OpenCopilot is not affiliated with Shopify, and Shopify does not use OpenCopilot; it's just a demo of what copilots are capable of.)
- Shopify is developing "Shopify Sidekick."
- Microsoft is working on "Windows Copilot."
- GitHub is in the process of creating "GitHub Copilot."
- Microsoft is also developing "Bing Copilot."
And our goal is to empower every SaaS product with the ability to have their own AI copilots tailored for their unique products.
- It is capable of calling your underlying APIs.
- It can transform the response into meaningful text.
- It can automatically populate certain request payload fields based on the context.
- For instance, you can request actions like: "Initiate a new case about X problem," and the title field will be automatically filled with the appropriate name.
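To make that concrete: assuming the hypothetical createCase endpoint from the definition sketch above (none of these names come from OpenCopilot itself), a request like "Initiate a new case about X problem" could result in a payload along these lines:

```ts
// Hypothetical illustration only: a payload the copilot might construct for
// "Initiate a new case about X problem" against a POST /cases endpoint.
const generatedPayload = {
  title: "X problem", // filled automatically from the user's message
  description: "Case opened via the copilot chat.", // other fields can be inferred from context
};

// OpenCopilot would then call your API with this payload, roughly equivalent to:
// await fetch("https://your-api.example.com/cases", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(generatedPayload),
// });

export { generatedPayload };
```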
- Currently, it does not support calling multiple endpoints simultaneously (feature coming soon).
- It is not suitable for handling large APIs.
- It is not equipped to handle complex APIs.
- It cannot remember the chat history (every message is independent of previous messages).
- Create unlimited copilots.
- Embed the copilot on your SaaS product using standard JS calls (a sketch follows this list).
- TypeScript chat bubble.
- Provide Swagger definitions for your APIs.
- Swagger definition validator + recommender.
- UI endpoints editor.
- Chat memory.
- Vector DB support for large Swagger files.
- Plugins system to support different types of authentications.
- Offline LLMs.
- Ability to ingest text data, PDF files, websites, and extra data sources.
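Below is a rough sketch of what embedding the chat bubble can look like. The real widget URL, init function, and option names come from the OpenCopilot console when you create a copilot; everything shown here is a placeholder.

```ts
// Hypothetical embed sketch. The actual script URL, init function, and option
// names are provided by the OpenCopilot console; these are placeholders.
declare global {
  interface Window {
    initAiCoPilot?: (options: {
      token: string;            // copilot token from the console (placeholder name)
      apiUrl: string;           // e.g. the local backend URL, http://localhost:8888/backend
      initialMessage?: string;  // optional greeting shown in the chat bubble
    }) => void;
  }
}

const widgetScript = document.createElement("script");
widgetScript.src = "https://example.com/copilot-widget.js"; // placeholder URL
widgetScript.onload = () => {
  window.initAiCoPilot?.({
    token: "YOUR_COPILOT_TOKEN",
    apiUrl: "http://localhost:8888/backend",
    initialMessage: "Hi! How can I help you today?",
  });
};
document.body.appendChild(widgetScript);

export {};
```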
We love hearing from you! Got any cool ideas or requests? We're all ears! So, if you have something in mind, give us a shout!
- OpenCopilot Flows Editor
- The backend server (API) is reachable via http://localhost:8888/backend
- The dashboard server is reachable via http://localhost:8888/ or http://localhost:8888/dashboard
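If you want to quickly confirm the local setup is up, a small sketch like the following can ping both URLs (assumes Node 18+ for the built-in fetch; no dedicated health-check endpoint is assumed, it only checks that the base URLs respond):

```ts
// Minimal reachability check for the local OpenCopilot setup.
const urls = [
  "http://localhost:8888/backend",
  "http://localhost:8888/dashboard",
];

for (const url of urls) {
  try {
    const res = await fetch(url);
    console.log(`${url} -> HTTP ${res.status}`);
  } catch (err) {
    console.error(`${url} is not reachable:`, err);
  }
}

export {};
```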
This project follows the all-contributors specification. Contributions of any kind welcome!