
Bug: Fix recursion error when using streaming with tools #81

Closed
wants to merge 3 commits into from

Conversation

munday-tech

Added an additional parameter to allow independently turning off streaming, ensuring recursion errors don't occur when calling generate from stream with streaming=True on BedrockChat.

@munday-tech
Author

Fixes #80

@laithalsaadoon
Contributor

@3coins @bigbernnn can you please have a look? I confirmed this PR is working.

@laithalsaadoon (Contributor) left a comment

LGTM

@bigbernnn
Contributor

@munday-tech Looks good! Please fix the small linting error.

@munday-tech
Author

@bigbernnn Should be fixed now.

```diff
@@ -439,7 +439,7 @@ def _stream(
         if "claude-3" in self._get_model():
             if _tools_in_params({**kwargs}):
                 result = self._generate(
-                    messages, stop=stop, run_manager=run_manager, **kwargs
+                    messages, stop=stop, run_manager=run_manager, stream=False, **kwargs
```


@deepakdalakoti

Is there any update on this? I am facing the same issue.

@3coins (Collaborator) left a comment

@munday-tech
Thanks for submitting this fix. I added a suggestion to avoid adding a new parameter to _generate. Can you try this and check if it works?

Comment on lines 441 to 442:

```diff
-                    messages, stop=stop, run_manager=run_manager, **kwargs
+                    messages, stop=stop, run_manager=run_manager, stream=False, **kwargs
```

@3coins (Collaborator)

Rather than adding a new parameter, you can try adding it to the kwargs:

```python
kw = {**kwargs, "stream": False}
result = self._generate(
    messages, stop=stop, run_manager=run_manager, **kw
)
```
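To illustrate why forwarding `stream=False` breaks the cycle, here is a minimal, self-contained sketch. `FakeChat` is a hypothetical stand-in, not the actual BedrockChat code: when streaming is on, `_generate` delegates to `_stream`, and `_stream` falls back to `_generate` for tool calls, so without an override the two methods would call each other forever.

```python
# Hypothetical sketch of the recursion hazard and the kwargs-based fix.
class FakeChat:
    def __init__(self, streaming=True):
        self.streaming = streaming
        self.calls = 0

    def _stream(self, messages, **kwargs):
        # Tool calls fall back to a non-streaming _generate call here.
        # Without stream=False, _generate would re-enter _stream.
        kw = {**kwargs, "stream": False}
        yield self._generate(messages, **kw)

    def _generate(self, messages, **kwargs):
        self.calls += 1
        # Pop the override so it is not forwarded further downstream.
        should_stream = kwargs.pop("stream", self.streaming)
        if should_stream:
            # Streaming path: delegate to _stream (the recursion hazard).
            return next(self._stream(messages, **kwargs))
        return f"result for {len(messages)} messages"


chat = FakeChat(streaming=True)
out = next(chat._stream(["hi"]))
print(out)  # → result for 1 messages (no RecursionError)
```

With the `{**kwargs, "stream": False}` override, `_generate` runs exactly once; dropping it would make the pair recurse until Python raises `RecursionError`.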

```diff
@@ -497,19 +497,23 @@ def _generate(
         messages: List[BaseMessage],
         stop: Optional[List[str]] = None,
         run_manager: Optional[CallbackManagerForLLMRun] = None,
+        stream: Optional[bool] = None,
```

@3coins (Collaborator)

Suggested change (remove the new parameter):

```diff
-        stream: Optional[bool] = None,
```

```diff
         **kwargs: Any,
     ) -> ChatResult:
+        should_stream = stream if stream is not None else self.streaming
```

@3coins (Collaborator)

Suggested change:

```diff
-        should_stream = stream if stream is not None else self.streaming
+        should_stream = (
+            kwargs.pop("stream", self.streaming) if kwargs else self.streaming
+        )
```
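A small standalone demo of what the suggested kwargs-based toggle does (variable names here are illustrative, not from the PR): `dict.pop` returns the override when the key is present and the default otherwise, and it also removes the key so it is not forwarded to downstream calls.

```python
# Stand-in for self.streaming in the sketch below.
default_streaming = True

# Caller passed an explicit override: pop returns it and strips the key.
kw1 = {"stream": False, "temperature": 0.2}
stream1 = kw1.pop("stream", default_streaming) if kw1 else default_streaming

# No kwargs at all: fall back to the instance default.
kw2 = {}
stream2 = kw2.pop("stream", default_streaming) if kw2 else default_streaming

print(stream1, kw1)  # → False {'temperature': 0.2}
print(stream2)       # → True
```

Note that the override key is consumed, so the remaining kwargs can be forwarded to the model call without leaking the `stream` flag.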

@deepakdalakoti

I did a quick test with this solution. While it fixes the recursion bug, the tokens are still not being streamed (for claude-3 models).

@rsgrewal-aws

@munday-tech could you review the requested changes and also confirm whether the tokens get streamed with this fix?

@3coins
Collaborator

3coins commented Jul 23, 2024

Working on another PR that allows streaming to work as expected. The change here relies on the interplay between _stream and _generate and does not return the chunks independently. See #119.

@3coins
Collaborator

3coins commented Oct 7, 2024

Fixed by #119

@3coins 3coins closed this Oct 7, 2024