LiteLLM Example in mistral docs wrong #3024
Comments
@ishaan-jaff, the example is correct. You are sharing an example of how to use Mistral's inference platform, but this is Azure AI. Does that make sense?
any reason why you could not use it like this, @santiagxf? (I'm the maintainer of litellm.) This looks a lot easier to me and it can go to an Azure AI endpoint:

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""  # placeholder: your key for the endpoint

response = completion(
    model="mistral/mistral-tiny",  # the "mistral/" prefix routes through litellm's Mistral provider
    api_base="your-api-base",      # point this at your endpoint URL
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
Hi @santiagxf, I just deployed on Azure AI Studio and was able to run inference with this code. If possible, can we update the Python notebook with the following code? It uses the standard format from the litellm docs: https://docs.litellm.ai/docs/providers/azure_ai. Happy to make a PR for this too.

```python
from litellm import completion

response = completion(
    model="mistral/Mistral-large-dfgfj",  # the deployment name, prefixed with "mistral/"
    api_base="https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1",
    api_key="JGbKodRcTp****",  # the endpoint's key (redacted)
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
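For the notebook it may also be safer not to hard-code the key; a minimal variation, assuming you export the key and endpoint as environment variables first (`AZURE_AI_MISTRAL_BASE` and `AZURE_AI_MISTRAL_KEY` are made-up names, not anything litellm defines):

```python
import os

from litellm import completion

# Hypothetical variable names: export AZURE_AI_MISTRAL_BASE and AZURE_AI_MISTRAL_KEY
# in your shell before running; neither name comes from litellm or the notebook.
response = completion(
    model="mistral/Mistral-large-dfgfj",
    api_base=os.environ["AZURE_AI_MISTRAL_BASE"],
    api_key=os.environ["AZURE_AI_MISTRAL_KEY"],
    messages=[{"role": "user", "content": "hello from litellm"}],
)
print(response)
```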
We tried your example, but it looks like the POST request sent from LiteLLM was:

```
curl -X POST \
https://api.mistral.ai/v1/ \
-d '{'model': 'Mistral-large-dfgfj', 'messages': [{'role': 'user', 'content': 'hello from litellm'}], 'extra_body': {}}'
```

i.e., the request went to the Mistral API instead of the Azure AI endpoint we passed in `api_base`. Is this something you can fix? @ishaan-jaff
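For anyone reproducing this, one way to confirm where the request actually goes is litellm's verbose debug flag; a minimal sketch reusing the snippet from above:

```python
import litellm
from litellm import completion

litellm.set_verbose = True  # debug flag: litellm logs the raw outgoing request, including the target URL

response = completion(
    model="mistral/Mistral-large-dfgfj",
    api_base="https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1",
    api_key="JGbKodRcTp****",  # redacted key from the example above
    messages=[{"role": "user", "content": "hello from litellm"}],
)
```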
Yes, we fixed this today: BerriAI/litellm#2216. Thanks for raising this, @santiagxf!
I just had a look and noticed that it requires […]. Also, I spotted this line: […]. Does it mean […]?
Nope, that line ensures we use […].
Do you mean the litellm package should append a […]?
I think you're right about this actually; going to take a look at it. Summarizing next steps from litellm: […]

Does this sound good, @santiagxf? Should I make a PR to this repo once this is fixed?
It sounds good to me. The URL thing is not a big deal, but I think it would be nice to use the same approach the official client uses. Thanks for taking care of this! Once the new version of the library is out we can update this doc. Feel free to use this issue; I will keep it open so I can remember to do it.
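To make the "URL thing" concrete, here is a sketch of the kind of base-URL normalization being discussed. `normalize_api_base` is a hypothetical name, and the assumption that the fix amounts to appending `/v1` is ours (inferred from the endpoint URLs above), not something confirmed in this thread:

```python
# Hypothetical helper for illustration only; the actual fix lives in BerriAI/litellm#2247,
# and whether the official client normalizes URLs exactly this way is an assumption.
def normalize_api_base(api_base: str) -> str:
    """Append /v1 to an endpoint base URL if it is missing."""
    base = api_base.rstrip("/")
    if not base.endswith("/v1"):
        base += "/v1"
    return base

print(normalize_api_base("https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com"))
# https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1
```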
Hi, […]
@hsleiman1 noted - will add this as an improvement too.
Tracking this here, @hsleiman1: BerriAI/litellm#2237
Fixes are here: BerriAI/litellm#2247. @hsleiman1 @santiagxf, will update once a new release is out.
@ishaan-jaff we still see the same issues happening. I added a comment in the issue you created on your repo: […]
Operating System
MacOS
Version Information
not relevant
Steps to reproduce
https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/mistral/litellm.ipynb
@santiagxf Thanks for showing an example with litellm, but the docs are wrong.
Here's how to use litellm with Mistral:
https://docs.litellm.ai/docs/providers/mistral
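For reference, the pattern from the linked provider docs looks like this (a minimal sketch based on https://docs.litellm.ai/docs/providers/mistral and the snippet earlier in this thread; the empty key is a placeholder):

```python
import os

from litellm import completion

os.environ["MISTRAL_API_KEY"] = ""  # placeholder: your Mistral API key

# the "mistral/" prefix tells litellm to route the call to the Mistral provider
response = completion(
    model="mistral/mistral-tiny",
    messages=[{"role": "user", "content": "hello from litellm"}],
)
print(response)
```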
Expected behavior
Actual behavior
Additional information
No response