
[inference] fold openai support into provider param #1205

Merged: 12 commits into main from api-only-provider-mode, Feb 28, 2025

Conversation

julien-c (Member)

i.e. no need to override an endpoint anymore

This only works in "client-side" mode, i.e. when passing a provider key

WDYT?
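
For context, here is a minimal sketch of what the folded-in provider param looks like from the client side (assuming the @huggingface/inference package; OPENAI_KEY is a placeholder for the caller's own OpenAI API key, and the option names follow the test change quoted below):

import { HfInference } from "@huggingface/inference";

// Placeholder key: with a provider-specific key, the call is routed directly
// to that provider ("client-side" mode) instead of via an overridden endpoint.
const hf = new HfInference(process.env.OPENAI_KEY);

const out = await hf.chatCompletion({
  provider: "openai", // OpenAI is now just another provider value
  model: "openai/gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(out.choices[0].message.content);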

@julien-c julien-c requested a review from Wauplin February 16, 2025 08:49
@@ -757,7 +757,7 @@ describe.concurrent("HfInference", () => {
const hf = new HfInference(OPENAI_KEY);
const stream = hf.chatCompletionStream({
provider: "openai",
model: "openai/gpt-3.5-turbo",
julien-c (Member Author) commented on this line:

i'm wondering if we should still namespace model names and strip it in the implem? (just to keep consistency with HF repos..) 🤷

@Wauplin (Contributor) commented Feb 25, 2025:

I find it a bit redundant but ok to switch if you prefer

julien-c (Member Author):

(what if they open-source it at some point in the future? etc.)
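
If the namespaced form is kept, the OpenAI provider implementation would presumably strip the "openai/" prefix before calling the provider's API. A hypothetical sketch (the helper name is illustrative, not from this PR):

// Hypothetical helper: accept both "openai/gpt-3.5-turbo" and "gpt-3.5-turbo"
// and return the bare model id expected by the provider's API.
function stripProviderPrefix(model: string, provider: string): string {
  const prefix = `${provider}/`;
  return model.startsWith(prefix) ? model.slice(prefix.length) : model;
}

stripProviderPrefix("openai/gpt-3.5-turbo", "openai"); // => "gpt-3.5-turbo"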

@Wauplin (Contributor) commented Feb 25, 2025

I addressed the comment but won't continue until we know which direction we want to take. Just let me know and I'll check the tests

@Wauplin Wauplin marked this pull request as ready for review February 27, 2025 14:27
@Wauplin Wauplin requested a review from gary149 February 27, 2025 14:27
@SBrandeis SBrandeis self-assigned this Feb 28, 2025
julien-c (Member Author) left a comment:

i can't self-approve (even though it's basically only @Wauplin's work 😁) but IMO this is good to go 👍

Wauplin (Contributor) left a comment:

Approving it, though it doesn't count much ^^

@SBrandeis SBrandeis changed the title from "draft: fold openai support into provider param" to "[inference] fold openai support into provider param" on Feb 28, 2025
@Wauplin Wauplin merged commit 822ab9e into main Feb 28, 2025
5 checks passed
@Wauplin Wauplin deleted the api-only-provider-mode branch February 28, 2025 15:30