
Add LiteLLM - support for Vertex AI, Gemini, Anthropic, Bedrock (100+LLMs) #552

Open · wants to merge 2 commits into main
Conversation

ishaan-jaff

@ishaan-jaff ishaan-jaff commented Jan 1, 2025

Add LiteLLM - support for Vertex AI, Gemini, Anthropic, Bedrock (100+LLMs)

What's changing

This PR adds support for the above-mentioned LLMs using LiteLLM (https://github.com/BerriAI/litellm/).
LiteLLM is a lightweight package that simplifies LLM API calls: use any LLM as a drop-in replacement for gpt-4o.

Example

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="openai/gpt-4o", messages=messages)

# anthropic call
response = completion(model="anthropic/claude-3-sonnet-20240229", messages=messages)
print(response)
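The key idea above is that the backend is selected from the `provider/model` prefix in the model string. As a rough illustration of that routing idea (a hypothetical `parse_model` helper, not LiteLLM's actual internals), one might sketch:

```python
def parse_model(model: str) -> tuple[str, str]:
    """Split a "provider/model" string into (provider, model_name).

    A bare model name with no "/" is assumed here to default to the
    "openai" provider; this default is an illustration, not LiteLLM's
    documented behavior.
    """
    if "/" in model:
        provider, _, name = model.partition("/")
        return provider, name
    return "openai", model

# The two calls above would route to different backends:
print(parse_model("openai/gpt-4o"))                       # ('openai', 'gpt-4o')
print(parse_model("anthropic/claude-3-sonnet-20240229"))  # ('anthropic', 'claude-3-sonnet-20240229')
```

This is only meant to show why a single `completion()` entry point can cover many providers: the provider is encoded in the model string itself, so caller code never changes shape.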

Response (OpenAI Format)

{
    "id": "chatcmpl-565d891b-a42e-4c39-8d14-82a1f5208885",
    "created": 1734366691,
    "model": "claude-3-sonnet-20240229",
    "object": "chat.completion",
    "system_fingerprint": null,
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Hello! As an AI language model, I don't have feelings, but I'm operating properly and ready to assist you with any questions or tasks you may have. How can I help you today?",
                "role": "assistant",
                "tool_calls": null,
                "function_call": null
            }
        }
    ],
    "usage": {
        "completion_tokens": 43,
        "prompt_tokens": 13,
        "total_tokens": 56,
        "completion_tokens_details": null,
        "prompt_tokens_details": {
            "audio_tokens": null,
            "cached_tokens": 0
        },
        "cache_creation_input_tokens": 0,
        "cache_read_input_tokens": 0
    }
}
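Because every provider's reply is normalized to this OpenAI-style schema, downstream code can read it uniformly. A small stdlib-only sketch that pulls the assistant message out of the response above (trimmed to the fields used here) and checks that the token counts are self-consistent:

```python
import json

# The response shown above, reduced to the fields this sketch reads
raw = """
{
    "model": "claude-3-sonnet-20240229",
    "object": "chat.completion",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {"content": "Hello! How can I help you today?", "role": "assistant"}
        }
    ],
    "usage": {"completion_tokens": 43, "prompt_tokens": 13, "total_tokens": 56}
}
"""

data = json.loads(raw)

# Same access path regardless of which provider produced the reply
reply = data["choices"][0]["message"]["content"]
usage = data["usage"]

print(reply)
# prompt_tokens + completion_tokens should equal total_tokens (13 + 43 = 56)
assert usage["prompt_tokens"] + usage["completion_tokens"] == usage["total_tokens"]
```

The point is that code written against `response["choices"][0]["message"]` keeps working whether the model string names OpenAI, Anthropic, Bedrock, or Vertex AI.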

Provide a clear and concise description of the content changes you're proposing. List all the
changes you are making to the content.

  • Updated section...
  • Added new...
  • Removed outdated information from...
  • Fixed a typo in...
  • ...

If this PR is related to an issue or closes one, please link it here.

Refs #...
Closes #...

How to test it

Steps to test the changes:

Additional notes for reviewers

Anything you'd like to add to help the reviewer understand the changes you're proposing.

I already...

  • Tested the changes in a working environment to ensure they work as expected
  • Added some tests for any new functionality
  • Updated the documentation (both comments in code and product documentation under /docs)
  • Checked if a (backend) DB migration step was required and included it if required

@github-actions github-actions bot added backend api Changes which impact API/presentation layer labels Jan 1, 2025
@ishaan-jaff
Author

Can I get a review, @njbrake? #552

@ividal
Contributor

ividal commented Jan 2, 2025

Hi @ishaan-jaff , thanks for the contribution! 🙌

Just wanted to give you an ack and (to be completely transparent) a heads-up that we're not quite open to feature contributions just yet. It's a bummer for us too, and we'll be opening up soon.

We do love the idea of supporting a broader range of models through LiteLLM and want to see what we need to add (and remove!) from the code base as soon as we're back from the Winter break.

Successfully merging this pull request may close these issues.

[FEATURE]: LiteLLM for LLM provider backends