
BUG: role with null value added to the history on content blocked due to Safety reasons #197

Open
Flucadetena opened this issue Jul 26, 2024 · 2 comments · Fixed by #205


@Flucadetena
Contributor

Flucadetena commented Jul 26, 2024

Description of the bug:

When using chat mode and sending messages without streaming (using `sendMessage` instead of `sendMessageStream`), blocked responses from the AI are added to `_history` with a null role. This throws an exception and breaks the `ChatSession`, because multi-turn mode forces the conversation to alternate between the user and model roles.

This normally throws the following exception:

I/flutter (29978): ----------------FIREBASE CRASHLYTICS----------------
I/flutter (29978): The following exception was thrown Init App Services Fatal error:
I/flutter (29978): Please use a valid role: user, model.
I/flutter (29978): 
I/flutter (29978): #0      parseGenerateContentResponse (package:google_generative_ai/src/api.dart:583:54)
I/flutter (29978): <asynchronous suspension>
I/flutter (29978): #1      ChatSession.sendMessage (package:google_generative_ai/src/chat.dart:67:24)
I/flutter (29978): <asynchronous suspension>
I/flutter (29978): #2      ChatSession.sendMessage.<anonymous closure> (package:firebase_vertexai/src/vertex_chat.dart:71:15)
I/flutter (29978): <asynchronous suspension>
I/flutter (29978): ---------------------------------------------------- 

Actual vs expected behavior:

If a response is flagged and blocked, it should not be added to the history. This would allow the developer to handle the error and either retry the generation or add a generic response from the AI so the conversation can continue naturally.
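To illustrate the expected handling, here is a minimal sketch of a defensive wrapper around `ChatSession.sendMessage`. The helper name `sendMessageSafely` and the fallback text are assumptions, not part of the library; it assumes the `google_generative_ai` package's `Content.text` and `GenerativeAIException`:

```dart
import 'package:google_generative_ai/google_generative_ai.dart';

/// Hypothetical helper: sends a message and, if generation fails
/// (e.g. the response was blocked for safety reasons), returns a
/// generic fallback so the conversation can continue naturally.
Future<String> sendMessageSafely(
  ChatSession chat,
  String text, {
  String fallback = 'Sorry, I cannot answer that. Could you rephrase?',
}) async {
  try {
    final response = await chat.sendMessage(Content.text(text));
    return response.text ?? fallback;
  } on GenerativeAIException {
    // With the bug described above, the blocked turn has already been
    // recorded in _history with a null role by this point, so later
    // calls on the same session still fail; the wrapper only helps
    // once blocked responses are kept out of the history.
    return fallback;
  }
}
```

Note the caveat in the comment: catching the exception alone is not enough while the null-role entry is still written to the history.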

Any other information you'd like to share?

No response

@Flucadetena Flucadetena changed the title BUG: role with null value added to the history. BUG: role with null value added to the history on content blocked due to Safety reasons Jul 26, 2024
Flucadetena added a commit to Flucadetena/generative-ai-dart that referenced this issue Jul 26, 2024
…null role added to the history of the ChatSession when content was blocked due to Safety or other reasons.

closes issue google-gemini#197
natebosch added a commit that referenced this issue Aug 16, 2024
Fixes #197

A content requires a role when it is sent to the model in the history.
If the backend happens to respond with a message that has no role,
default it to 'model'.
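The defaulting described in the commit message could look roughly like this. This is a sketch, not the actual `api.dart` source; the helper name is hypothetical and the parsing of parts is elided:

```dart
import 'package:google_generative_ai/google_generative_ai.dart';

/// Sketch of the fix: when parsing a response candidate, fall back to
/// 'model' if the backend omitted the role, so the resulting Content
/// is always valid when it is later appended to the chat history.
Content parseCandidateContent(Map<String, Object?> json) {
  final role = (json['role'] as String?) ?? 'model'; // default null role
  final parts = <Part>[]; // parsing of parts elided for brevity
  return Content(role, parts);
}
```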
@Flucadetena
Contributor Author

Awesome @natebosch thank you for the update.

@Flucadetena
Contributor Author

Hi team,

The issue is not solved by this fix, as the main problem is the missing answer from the model in the history.

You can use this example repo I created and follow the instructions, and you will see the chat break completely. firebase/flutterfire#13104

The only two solutions I can think of, which I currently use in my own code, are:

  • Give the developer the option to pass a default message in case the content gets blocked. This would preserve the "user > model > user > model" alternation required by multi-turn requests.
  • Before every generation, check the history and merge any consecutive content entries from the same role. This is what AI Studio and the JS package do.
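The second workaround above can be sketched in Dart. The helper name `mergeConsecutiveRoles` is hypothetical; it assumes the `Content` class from `google_generative_ai` and copies part lists rather than mutating the original history:

```dart
import 'package:google_generative_ai/google_generative_ai.dart';

/// Sketch: before sending, merge consecutive history entries that
/// share a role, so the user/model alternation always holds even if
/// a model turn was dropped (e.g. blocked for safety reasons).
List<Content> mergeConsecutiveRoles(List<Content> history) {
  final merged = <Content>[];
  for (final content in history) {
    if (merged.isNotEmpty && merged.last.role == content.role) {
      // Same role as the previous entry: append its parts instead of
      // adding a second entry with the same role.
      merged.last.parts.addAll(content.parts);
    } else {
      merged.add(Content(content.role, List.of(content.parts)));
    }
  }
  return merged;
}
```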

Hope it helps, and thanks for the great work.

@Flucadetena Flucadetena reopened this Oct 4, 2024