Bug: Fix recursion error when using streaming with tools #81
Conversation
Fixes #80
@3coins @bigbernnn Can you please take a look? I confirmed this PR is working.
LGTM
@munday-tech Looks good! Please fix the small linting error.
@bigbernnn Should be fixed now.
```diff
@@ -439,7 +439,7 @@ def _stream(
         if "claude-3" in self._get_model():
             if _tools_in_params({**kwargs}):
                 result = self._generate(
-                    messages, stop=stop, run_manager=run_manager, **kwargs
+                    messages, stop=stop, run_manager=run_manager, stream=False, **kwargs
```
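For context, the recursion occurs because `_generate` falls back to streaming when `self.streaming` is true, so `_stream` calling `_generate` without overriding that flag loops until a `RecursionError`. A minimal sketch of the cycle and of how `stream=False` breaks it (the `FakeChat` class and its return values are illustrative stand-ins, not the real BedrockChat implementation):

```python
class FakeChat:
    """Simplified stand-in for the chat model; not the actual BedrockChat code."""

    def __init__(self, streaming: bool = True):
        self.streaming = streaming

    def _generate(self, messages, stream=None, **kwargs):
        # The fix: an explicit stream argument overrides self.streaming.
        should_stream = stream if stream is not None else self.streaming
        if should_stream:
            # Pre-fix, this call bounced back into _generate with the flag
            # still True, recursing until RecursionError.
            return self._stream(messages, **kwargs)
        return f"generated: {messages}"

    def _stream(self, messages, **kwargs):
        # stream=False breaks the _stream -> _generate -> _stream cycle.
        return self._generate(messages, stream=False, **kwargs)


chat = FakeChat(streaming=True)
print(chat._stream("hi"))  # -> generated: hi (no RecursionError)
```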
Is there any update on this? I am facing the same issue.
@munday-tech Thanks for submitting this fix. I added a suggestion to avoid adding a new parameter to `_generate`. Can you try it and check whether it works?
```diff
                 result = self._generate(
-                    messages, stop=stop, run_manager=run_manager, **kwargs
+                    messages, stop=stop, run_manager=run_manager, stream=False, **kwargs
```
Rather than adding a new parameter, you can add it to the kwargs:

```python
kw = {**kwargs, "stream": False}
result = self._generate(
    messages, stop=stop, run_manager=run_manager, **kw
)
```
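This works because later keys win when merging dict literals, so the explicit `"stream": False` overrides any caller-supplied value. A small sketch of that merge semantics (the helper name is made up for this example):

```python
def call_with_stream_off(**kwargs):
    # Later keys win in a dict literal, so "stream": False overrides
    # any value the caller passed in kwargs.
    kw = {**kwargs, "stream": False}
    return kw


print(call_with_stream_off(temperature=0.2, stream=True))
# -> {'temperature': 0.2, 'stream': False}
```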
```diff
@@ -497,19 +497,23 @@ def _generate(
         messages: List[BaseMessage],
         stop: Optional[List[str]] = None,
         run_manager: Optional[CallbackManagerForLLMRun] = None,
+        stream: Optional[bool] = None,
```
Suggested change (remove the new parameter):

```diff
-        stream: Optional[bool] = None,
```
```diff
         **kwargs: Any,
     ) -> ChatResult:
+        should_stream = stream if stream is not None else self.streaming
```
Suggested change:

```diff
-        should_stream = stream if stream is not None else self.streaming
+        should_stream = (
+            kwargs.pop("stream", self.streaming) if kwargs else self.streaming
+        )
```
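One subtlety of the `kwargs.pop` approach: `pop` also removes `"stream"` from `kwargs`, so the flag is not forwarded to later calls, and the `if kwargs` guard means the instance default applies whenever no kwargs are passed at all. A sketch of those semantics with a hypothetical helper (not the library code):

```python
def resolve_should_stream(kwargs, default_streaming):
    # Mirrors the suggested expression: pop "stream" out of kwargs so it is
    # not forwarded downstream, falling back to the instance default.
    return kwargs.pop("stream", default_streaming) if kwargs else default_streaming


kw = {"stream": False, "temperature": 0.1}
print(resolve_should_stream(kw, True))  # -> False
print(kw)                               # -> {'temperature': 0.1}  ("stream" consumed)
print(resolve_should_stream({}, True))  # -> True  (empty kwargs -> default)
```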
I did a quick test with this solution. While it fixes the recursion bug, the tokens are still not being streamed for claude-3 models.
@munday-tech Could you review the requested changes and also confirm whether the tokens get streamed with this fix?
Working on another PR that allows streaming to work as expected. The change here relies on the interplay between `_stream` and `_generate` and does not return the chunks independently. See #119.
Fixed by #119 |
Added an additional parameter to allow streaming to be turned off independently, ensuring recursion errors don't occur when calling `_generate` from `_stream` with `streaming=True` on `BedrockChat`.