
Cannot call CompleteChatAsync with options after ToolChatMessage in messages of type List<ChatMessage> #218

Open
taihuy opened this issue Sep 19, 2024 · 7 comments
Assignees
Labels
bug Something isn't working

Comments

@taihuy

taihuy commented Sep 19, 2024

Confirm this is not an issue with the OpenAI Python Library

  • This is not an issue with the OpenAI Python Library

Confirm this is not an issue with the underlying OpenAI API

  • This is not an issue with the OpenAI API

Confirm this is not an issue with Azure OpenAI

  • This is not an issue with Azure OpenAI

Describe the bug

I really like the idea of this code snippet, but client.CompleteChat(messages, options) throws an exception from the server without any specific error (only 400 Bad Request is returned) if we call this method after a ToolCall was processed in the previous loop iteration. If we remove the options and call CompleteChat(messages) with only the messages, it works fine, even after a ToolCall.

do
{
    requiresAction = false;
    ChatCompletion chatCompletion = client.CompleteChat(messages, options);

    switch (chatCompletion.FinishReason)
    {
        // ... handle ChatFinishReason.ToolCalls, ChatFinishReason.Stop, etc.
    }
} while (requiresAction);

In some forums, I have seen the suggestion to call CompleteChat with options the first time, then process the tool calls, and finally call CompleteChat the last time without options. In that case, how can we include the whole history? For example, when the bot needs a parameter from the user and has to ask a question before it can run the function further.
In other words, how can we process tool calls as a chain (not in parallel), so that the output of the first tool can be the input of the next tool?
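For what it's worth, the pattern in the linked function-calling example already supports chaining, because every ToolChatMessage is appended to the shared messages list before the next CompleteChat call. A hedged sketch of that loop (RunTool is a hypothetical placeholder for your own tool dispatch; adapt to your tools):

```csharp
// Sequential ("chained") tool handling, based on the SDK's function-calling example.
// RunTool is a hypothetical helper that executes one tool call and returns its result.
List<ChatMessage> messages = [new UserChatMessage("...")];
bool requiresAction;

do
{
    requiresAction = false;
    ChatCompletion completion = client.CompleteChat(messages, options);

    if (completion.FinishReason == ChatFinishReason.ToolCalls)
    {
        // Echo the assistant turn (including its tool calls) into the history first.
        messages.Add(new AssistantChatMessage(completion));

        // Handle each tool call in order. Because every ToolChatMessage is appended
        // to the shared history, the next CompleteChat call sees the previous tool's
        // output, so the model can feed it into the next tool it chooses.
        foreach (ChatToolCall toolCall in completion.ToolCalls)
        {
            string toolResult = RunTool(toolCall);   // your dispatch logic
            messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
        }

        requiresAction = true;   // loop again so the model can use the results
    }
    else
    {
        messages.Add(new AssistantChatMessage(completion));
    }
} while (requiresAction);
```

If the model needs a parameter from the user, it finishes with ChatFinishReason.Stop and a question instead of a tool call; appending the user's reply to the same messages list and re-entering the loop preserves the whole history.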

To Reproduce

Just run the code snippet and make sure that the model runs at least one tool. You will see that it throws an exception.

do
{
    requiresAction = false;
    ChatCompletion chatCompletion = client.CompleteChat(messages, options);

    switch (chatCompletion.FinishReason)
    {
        // ... handle ChatFinishReason.ToolCalls, ChatFinishReason.Stop, etc.
    }
} while (requiresAction);

The error I have gotten:

System.ClientModel.ClientResultException: Service request failed.
      Status: 400 (Bad Request)
      
         at Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
         at Azure.AI.OpenAI.Chat.AzureChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
         at OpenAI.Chat.ChatClient.CompleteChatAsync(IEnumerable`1 messages, ChatCompletionOptions options, CancellationToken cancellationToken)

Code snippets

No response

OS

Windows

.NET version

8.0.6

Library version

2.0.0-beta.2

@taihuy taihuy added the bug Something isn't working label Sep 19, 2024
@joseharriaga
Collaborator

Thank you for reaching out, @taihuy ! This snippet is from our function calling example linked below, correct?
🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example03_FunctionCalling.cs

I just ran the example using the latest version of the library, and it works as expected. Did you make any modifications to the code? If you could share an end-to-end repro, that would be very helpful!

@taihuy
Author

taihuy commented Sep 20, 2024

Hi @joseharriaga,

Thanks very much for your quick answer. I really appreciate it. It is even more helpful when working with this kind of innovative technology that is still in beta.
I figured out why it happened with my code: it was because I used a data source in addition to tools in my ChatCompletionOptions. When I removed the data source, the code worked well, as in the example.

var searchDataSource = new AzureSearchChatDataSource()
{
    Endpoint = new Uri(searchEndpoint),
    IndexName = searchIndexName,
    Authentication = DataSourceAuthentication.FromApiKey(searchApiKey),
};

var options = new ChatCompletionOptions
{
    Tools = { _projectsInCompanyTool, _timeEntriesInProjectTool, _timeEntriesInCompanyTool, _usersInCompanyTool },
    ToolChoice = ChatToolChoice.Auto
};

#pragma warning disable AOAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates.
// options.AddDataSource(searchDataSource);
#pragma warning restore AOAI001

I got the following warning: "'Azure.AI.OpenAI.AzureChatCompletionOptionsExtensions.AddDataSource(OpenAI.Chat.ChatCompletionOptions, Azure.AI.OpenAI.Chat.AzureChatDataSource)' is for evaluation purposes only and is subject to change or removal in future updates." But I need both tools and the Azure Search service in my bot. Is there any way I can have both?

@dmytrostruk

@joseharriaga I have the same issue when using AzureSearchChatDataSource and function calling together. The response is 400 (Bad Request): Invalid chat message detected: message content must be string.

I noticed that it's related to AssistantChatMessage serialization: it works differently depending on the content parameter.

I tried it with following test code snippet:

var messages = new List<ChatMessage>
{
    new AssistantChatMessage(toolCalls: [], content: null),
    new AssistantChatMessage(toolCalls: [], content: string.Empty),
    new AssistantChatMessage(toolCalls: [], content: "test")
};

var result = await client.CompleteChatAsync(messages);

This is what the request looks like:

{
   "messages":[
      {
         "role":"assistant"
      },
      {
         "role":"assistant",
         "content":[
            {
               "type":"text",
               "text":""
            }
         ]
      },
      {
         "role":"assistant",
         "content":"test"
      }
   ],
   "model":"gpt-4o"
}
  1. When content is null, the content property is absent from the request.
  2. When content is an empty string, the content property is an array of objects with "type": "text" and "text": "".
  3. When content is a non-empty string, the content property is a plain string.

It looks like Azure OpenAI with a data source works only when the content property is a string (not an array), and the property must exist in the request body even when the content is null (in which case it should be an empty string). I'm not sure where exactly this fix should be applied, but I'm wondering whether the serialization logic on the OpenAI SDK side should be updated so that an empty string is sent as "content": "" or whether it should remain as "content": [{"type": "text", "text": ""}].
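As a side note, the serialization can be inspected locally without hitting the service. A minimal sketch, assuming the SDK's generated models implement IJsonModel&lt;T&gt; and can therefore be written through System.ClientModel's ModelReaderWriter (the constructor overload with toolCalls and content is the one used in the snippet above):

```csharp
using System;
using System.ClientModel.Primitives;
using OpenAI.Chat;

// Serialize a single AssistantChatMessage to JSON to see how the
// "content" property is emitted, without making a network call.
AssistantChatMessage message = new(toolCalls: [], content: string.Empty);
BinaryData json = ModelReaderWriter.Write(message);
Console.WriteLine(json.ToString());
```

On the version used above this should print the array form shown in the request body; the plain-string form would confirm a fixed SDK.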

@joseharriaga
Collaborator

Thank you for providing more context! Unfortunately, the Azure OpenAI service does not support function calling in combination with data sources. You can find more information about it here:
🔗 https://learn.microsoft.com/azure/ai-services/openai/concepts/use-your-data?tabs=ai-search%2Ccopilot#function-calling

@joseharriaga joseharriaga self-assigned this Oct 3, 2024
@dmytrostruk

Thank you for providing more context! Unfortunately, the Azure OpenAI service does not support function calling in combination with data sources. You can find more information about it here: 🔗 https://learn.microsoft.com/azure/ai-services/openai/concepts/use-your-data?tabs=ai-search%2Ccopilot#function-calling

@joseharriaga It's not supported, but in that case the request should still succeed, and either function calling or the data source should be ignored. But given the serialization aspect of assistant messages that I shared above, instead of ignoring the function calling / data source configuration, it throws 400 (Bad Request): Invalid chat message detected: message content must be string.

@joseharriaga
Collaborator

joseharriaga commented Oct 3, 2024

I'm not sure where exactly this fix should be applied, but I'm wondering if serialization logic on OpenAI SDK side should be updated for case with empty string to send "content":"" or it should remain as "content":[{"type":"text", "text": ""}]?

@dmytrostruk Ah! We merged and released a change where an empty string is represented as "content": "" instead of an array of content parts starting with version 2.0.0-beta.13.

@dmytrostruk

I'm not sure where exactly this fix should be applied, but I'm wondering if serialization logic on OpenAI SDK side should be updated for case with empty string to send "content":"" or it should remain as "content":[{"type":"text", "text": ""}]?

@dmytrostruk Ah! We merged and released a change where an empty string is represented as "content": "" instead of an array of content parts starting with version 2.0.0-beta.13.

@joseharriaga Great news, thanks a lot!
