Implement embedding model provider using LiteLLM #1078
As of now, Wren AI already implements LiteLLM as its LLM provider, so users can use any LLM supported by LiteLLM. However, we don't yet support LiteLLM's embedding models. Once this feature is done, users will also be able to use any embedding model supported by LiteLLM!
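For context, here is a minimal sketch of what a LiteLLM-backed embedding provider could look like. The `LiteLLMEmbedder` class, its constructor parameters, and the `embed` method are hypothetical illustrations, not Wren AI's actual interface; only the `litellm.embedding()` call reflects LiteLLM's real API.

```python
# Sketch of an embedding provider built on LiteLLM (class and
# parameters are assumptions, not Wren AI's real interface).
from typing import Optional

import litellm


class LiteLLMEmbedder:
    """Embeds texts with any embedding model LiteLLM supports."""

    def __init__(
        self,
        model: str = "openai/text-embedding-3-small",
        api_base: Optional[str] = None,  # e.g. an OpenAI-compatible endpoint
        api_key: Optional[str] = None,
    ):
        self.model = model
        self.api_base = api_base
        self.api_key = api_key

    def embed(self, texts: list[str]) -> list[list[float]]:
        # litellm.embedding() routes the request to the right backend
        # based on the "provider/model" prefix in the model string.
        response = litellm.embedding(
            model=self.model,
            input=texts,
            api_base=self.api_base,
            api_key=self.api_key,
        )
        return [item["embedding"] for item in response.data]


if __name__ == "__main__":
    embedder = LiteLLMEmbedder()  # assumes OPENAI_API_KEY is set
    vectors = embedder.embed(["What is Wren AI?"])
    print(len(vectors[0]))  # dimensionality of the embedding vector
```

Because LiteLLM normalizes providers behind a single call, switching embedding models would then be a configuration change rather than a code change.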
Comments

Hello, is this being picked up by someone? I am eagerly waiting for this so I can use LiteLLM with OpenAI-compatible models.

@sidharth-deriv hi, would you like to contribute to this?

I would love to, but I don't know much about how to consume from the LLM. I am interested in using Wren AI via LiteLLM.

@sidharth-deriv how about we schedule a time and I could introduce the task and codebase to you? Could you join our Discord server and DM me (Jimmy)?

Hi @cyyeh

Implemented, and will be released in the next version of Wren AI.