
Maintain support for legacy 'max_tokens' property (for non-OpenAI APIs, like Groq, Mistral) #228

kirk-marple opened this issue Sep 27, 2024 · 1 comment



Confirm this is a feature request for the .NET library and not the underlying OpenAI API

  • This is a feature request for the .NET library

Describe the feature or improvement you are requesting

LLM providers like Groq, Mistral, and Cerebras offer OpenAI-compatible APIs, but these still use the legacy max_tokens field.

It would be great if this library could remain backward compatible with these legacy APIs. Right now, we can't use the latest SDK with those APIs, since they error out if you send max_completion_tokens.

 > ChatCompletionOptions will automatically apply its MaxOutputTokenCount value (renamed from MaxTokens) to the new max_completion_tokens request body property
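
A possible interim workaround (not part of the SDK; the class name LegacyMaxTokensPolicy below is made up, and this is only a sketch) is a custom System.ClientModel pipeline policy that rewrites the serialized request body, renaming max_completion_tokens back to the legacy max_tokens before the request reaches the transport:

using System;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Collections.Generic;
using System.IO;
using System.Text.Json.Nodes;
using System.Threading.Tasks;

// Sketch: renames max_completion_tokens to the legacy max_tokens in the JSON request body.
public class LegacyMaxTokensPolicy : PipelinePolicy
{
    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
    }

    private static void RewriteBody(PipelineMessage message)
    {
        if (message.Request.Content is null)
            return;

        // Buffer the request body and patch the property name if it is present.
        using var buffer = new MemoryStream();
        message.Request.Content.WriteTo(buffer, default);
        buffer.Position = 0;

        if (JsonNode.Parse(buffer) is JsonObject body &&
            body.TryGetPropertyValue("max_completion_tokens", out JsonNode? value))
        {
            body.Remove("max_completion_tokens");
            body["max_tokens"] = value;
            message.Request.Content = BinaryContent.Create(BinaryData.FromString(body.ToJsonString()));
        }
    }
}

Assuming OpenAIClientOptions still derives from ClientPipelineOptions, the policy could be registered with options.AddPolicy(new LegacyMaxTokensPolicy(), PipelinePosition.PerCall) when constructing the client.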

Additional context

No response


JadynWong commented Sep 28, 2024

I use this library for OpenAI, Azure OpenAI, and Azure AI Inference.
Azure AI Inference returns errors when certain unsupported parameters are passed.
Currently, I work around this by using UnsafeAccessor and reflection to set _deprecatedMaxTokens and to set StreamOptions to null.

using System.Reflection;
using System.Runtime.CompilerServices;

using OpenAIChatCompletionOptions = OpenAI.Chat.ChatCompletionOptions;
using OpenAIChatToolChoice = OpenAI.Chat.ChatToolChoice;

public static class OpenAISdkHelper
{
    // Calls the internal setter behind the legacy max_tokens request property.
    [UnsafeAccessor(UnsafeAccessorKind.Method, Name = "set__deprecatedMaxTokens")]
    public static extern void SetMaxTokens(OpenAIChatCompletionOptions options, int? deprecatedMaxTokens);

    // Exposes the private field backing a predefined tool choice value.
    [UnsafeAccessor(UnsafeAccessorKind.Field, Name = "_predefinedValue")]
    public static extern ref string? GetToolChoicePredefinedValue(OpenAIChatToolChoice toolChoice);

    private static PropertyInfo? StreamOptionsProperty { get; } =
        typeof(OpenAIChatCompletionOptions).GetProperty("StreamOptions", BindingFlags.NonPublic | BindingFlags.Instance);

    // Clears the internal StreamOptions property so stream_options is not serialized.
    public static void SetStreamOptionsToNull(OpenAIChatCompletionOptions options)
    {
        StreamOptionsProperty?.SetValue(options, null);
    }
}
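
For illustration, a hypothetical call site using these helpers (the token count 1024 is arbitrary) could look like:

var options = new OpenAIChatCompletionOptions();
OpenAISdkHelper.SetMaxTokens(options, 1024);      // serialized as the legacy max_tokens
OpenAISdkHelper.SetStreamOptionsToNull(options);  // avoid sending stream_options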
