
Simplifying the OpenAI provider to use multiple model providers #1248

Draft: wants to merge 5 commits into main
Conversation

@srdas (Collaborator) commented Feb 17, 2025

Description

The OpenAI model interface has been widely adopted by many model providers (DeepSeek, vLLM, etc.), and this PR enables accessing those providers' models through the OpenAI provider. Current OpenAI models remain accessible via the same interface.

This PR also updates the related documentation on using models that work via the OpenAI provider.
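Because these providers all speak the OpenAI chat-completions protocol, switching providers amounts to changing the base URL and API key while keeping the request shape fixed. A minimal sketch of that idea (the helper function and default values below are illustrative, not code from this PR):

```python
# Sketch: the same OpenAI-style chat request works against any
# OpenAI-compatible endpoint; only the base URL (and key) change.
# build_chat_request is a hypothetical helper, not part of this PR.

def build_chat_request(base_url, model, messages):
    """Assemble the URL and JSON payload for a chat-completions call."""
    return {
        "url": base_url.rstrip("/") + "/chat/completions",
        "payload": {"model": model, "messages": messages},
    }

# The same request shape, pointed at two different providers.
openai_req = build_chat_request(
    "https://api.openai.com/v1", "gpt-4o-mini",
    [{"role": "user", "content": "Hello"}],
)
vllm_req = build_chat_request(
    "http://localhost:8000/v1", "my-local-model",
    [{"role": "user", "content": "Hello"}],
)
```

Only the `url` differs between the two requests; the payload structure is identical, which is what lets a single provider class serve all of these backends.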

Demo

See the new usage of models and the required settings shown below; note the new "OpenAI::general interface":
[Screenshot: AI settings panel showing the "OpenAI::general interface" option]

For any OpenAI model:
[Screenshot: openai-chat-openai]

For DeepSeek models:
[Screenshot: openai-chat-deepseek]

For models deployed with vLLM:
[Screenshot: openai-chat-vllm]
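The per-provider differences in the screenshots above largely reduce to the API base URL and whether a real API key is required. The concrete values below are my assumptions about typical endpoints (vLLM's local default, common hosted URLs), not settings taken from this PR:

```python
# Assumed connection settings per provider when using the general
# OpenAI interface. Base URLs are common defaults and may differ
# in your deployment; vLLM typically accepts any placeholder key.
PROVIDER_SETTINGS = {
    "openai":   {"base_url": "https://api.openai.com/v1",   "needs_key": True},
    "deepseek": {"base_url": "https://api.deepseek.com/v1", "needs_key": True},
    "vllm":     {"base_url": "http://localhost:8000/v1",    "needs_key": False},
}

def settings_for(provider):
    """Look up the assumed connection settings for a provider."""
    return PROVIDER_SETTINGS[provider]
```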

Code Completion

Code completion is also supported, as shown below:
[Screenshot: code completion via the OpenAI provider]

@srdas srdas added the enhancement New feature or request label Feb 17, 2025
@srdas srdas marked this pull request as ready for review February 17, 2025 23:41
@dlqqq dlqqq marked this pull request as draft February 18, 2025 19:17
Labels
enhancement New feature or request
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Add support for embedding models served through an OpenAI API
1 participant