
SelectorGroupChat not working with Gemini 2.0 Flash #5322

Closed
vishal-android-freak opened this issue Feb 2, 2025 · 3 comments · Fixed by #5334

vishal-android-freak commented Feb 2, 2025

What happened?

I get the error below as soon as I run the main file:

Traceback (most recent call last):
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/main.py", line 15, in <module>
    asyncio.run(main())
  File "/Users/naarang/.asdf/installs/python/3.11.3/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Users/naarang/.asdf/installs/python/3.11.3/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/.asdf/installs/python/3.11.3/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/main.py", line 11, in main
    await Console(team.run_stream(task="Draft an email for me to send to my boss."))
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_agentchat/ui/_console.py", line 117, in Console
    async for message in stream:
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_base_group_chat.py", line 420, in run_stream
    await self._runtime.send_message(
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 328, in send_message
    return await future
           ^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_core/_single_threaded_agent_runtime.py", line 417, in _process_send
    response = await recipient_agent.on_message(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 48, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_core/_routed_agent.py", line 389, in wrapper
    return_value = await func(self, message, ctx)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_base_group_chat_manager.py", line 108, in handle_start
    speaker_topic_type = await speaker_topic_type_future
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_agentchat/teams/_group_chat/_selector_group_chat.py", line 139, in select_speaker
    response = await self._model_client.create(messages=select_speaker_messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/autogen_ext/models/openai/_openai_client.py", line 518, in create
    result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
                                                                     ^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1727, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1849, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1543, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/Users/naarang/projects/paisa/paisa-backend-autogen/paisa_autogen/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1644, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - [{'error': {'code': 400, 'message': '* GenerateContentRequest.contents: contents is not specified\n', 'status': 'INVALID_ARGUMENT'}}]
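
For reference, the 400 comes from the speaker-selection call (select_speaker -> model_client.create in the traceback). Below is a minimal sketch that appears to reproduce the same request shape, under the assumption that the selector prompt is sent as a single system-role message, which the Gemini OpenAI-compatible endpoint maps to an empty contents list:

import asyncio
import os

from openai import AsyncOpenAI


async def repro() -> None:
    client = AsyncOpenAI(
        base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
        api_key=os.environ["GEMINI_API_KEY"],
    )
    # Only a system message and no user message: this is assumed to trigger the
    # same "GenerateContentRequest.contents: contents is not specified" error.
    await client.chat.completions.create(
        model="gemini-2.0-flash-exp",
        messages=[{"role": "system", "content": "Select the next speaker."}],
    )


if __name__ == "__main__":
    asyncio.run(repro())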

What did you expect to happen?

I expected it to work the same way it does with other models.

How can we reproduce it (as minimally and precisely as possible)?

agents.py

from autogen_agentchat.agents import AssistantAgent, UserProxyAgent
from models import model_client

user_agent = UserProxyAgent(name="user_confirmation_agent", description="You are a user confirmation agent to get confirmation from the user for the input provided by the assistant.")

assistant_agent = AssistantAgent(name="assistant_agent", system_message="You are an assistant to get the input from the user.", model_client=model_client)

models.py

from autogen_core.models import ModelFamily
from dotenv import load_dotenv
load_dotenv()
from autogen_ext.models.openai import OpenAIChatCompletionClient
import os

model_client = OpenAIChatCompletionClient(
    model="gemini-2.0-flash-exp",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
    api_key=os.environ.get("GEMINI_API_KEY"),
    model_info={
        "family": "unknown",
        "function_calling": True,
        "json_output": True,
        "vision": False
    }
)

main.py

from autogen_agentchat.teams import SelectorGroupChat
from agents import user_agent, assistant_agent
from models import model_client
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.ui import Console
import asyncio

async def main():
    team = SelectorGroupChat([user_agent, assistant_agent], model_client=model_client, termination_condition=TextMentionTermination("STOP"))

    await Console(team.run_stream(task="Draft an email for me to send to my boss."))


if __name__ == "__main__":
    asyncio.run(main())
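
As a possible interim workaround (a sketch only; availability of the selector_func parameter in this exact version is an assumption), SelectorGroupChat can be given a selector_func that bypasses the model-based speaker selection, so no extra Gemini call is made. Reusing the imports from main.py above:

from typing import Sequence

from autogen_agentchat.messages import ChatMessage


def alternate_speakers(messages: Sequence[ChatMessage]) -> str | None:
    # Hand-rolled alternation: the assistant speaks first, then the user
    # confirmation agent. Returning None falls back to model-based selection.
    if not messages or messages[-1].source != "assistant_agent":
        return "assistant_agent"
    return "user_confirmation_agent"


team = SelectorGroupChat(
    [user_agent, assistant_agent],
    model_client=model_client,
    selector_func=alternate_speakers,
    termination_condition=TextMentionTermination("STOP"),
)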

AutoGen version

0.4.5

Which package was this bug in

AgentChat

Model used

gemini-2.0-flash-exp, gemini-1.5-flash

Python version

3.11.3

Operating system

macOS 15.1

Any additional info you think would be helpful for fixing this bug

No response

ekzhu (Collaborator) commented Feb 3, 2025

Thanks for the issue. PR #5334

vishal-android-freak (Author) commented

Thanks for the quick fix, @ekzhu. I have added a small review on the PR :)

vishal-android-freak (Author) commented

Never mind, I read the code. Looks good :)

ekzhu added a commit that referenced this issue Feb 3, 2025