-
Hey @CarnageXD! I'm here to help you with your issue. I'm a bot that can assist you with bug fixes, answer questions, and guide you on becoming a contributor. Let's tackle this together!

I found a relevant discussion that might help. In LangChain, when streaming from Anthropic models, the "response_metadata" field on the streamed chunks can come back empty. There are configuration options in LangChain that affect the presence of cache usage information in "response_metadata". If you need cache usage information during streaming, you might need to check whether there are updates or additional configurations in the LangChain library that could enable this feature. As of the information available, the current implementation may not support cache usage details in the streaming response.
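As a rough sketch (the model name and prompt are placeholders, and this assumes a configured ChatAnthropic instance), aggregating streamed chunks and inspecting their metadata looks like this:

```ts
import { ChatAnthropic } from "@langchain/anthropic";
import { concat } from "@langchain/core/utils/stream";
import type { AIMessageChunk } from "@langchain/core/messages";

const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20241022" }); // placeholder

let aggregate: AIMessageChunk | undefined;
for await (const chunk of await model.stream("Hello!")) {
  // concat merges message chunks; usage_metadata token counts are summed.
  aggregate = aggregate === undefined ? chunk : concat(aggregate, chunk);
}

console.log(aggregate?.usage_metadata);    // input/output/total token counts
console.log(aggregate?.response_metadata); // reported empty when streaming
```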
-
Experiencing the same. Anthropic's streaming docs say the initial message_start event contains a Message object whose usage includes the cache fields (cache_creation_input_tokens and cache_read_input_tokens).

So it looks like the first response in the stream should have these parameters, but I can't see them when I log the responses. It may not have been implemented in LangChain yet.
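For comparison, here is a minimal sketch against the raw @anthropic-ai/sdk client rather than LangChain (model name and prompt are placeholders); when prompt caching is in play, the cache fields show up on the first event of the stream:

```ts
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic(); // picks up ANTHROPIC_API_KEY from the environment

const stream = await client.messages.create({
  model: "claude-3-5-sonnet-20241022", // placeholder model name
  max_tokens: 256,
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const event of stream) {
  // The first event in the stream is message_start; its Message object
  // carries the usage block, including the cache fields when caching applies.
  if (event.type === "message_start") {
    console.log(event.message.usage);
  }
}
```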
-
Example Code
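(The original snippet wasn't captured in this export; below is a minimal sketch of the setup described, assuming prompt caching is enabled through the beta header via clientOptions. The model name, header value, and prompts are placeholders.)

```ts
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  model: "claude-3-5-sonnet-20241022", // placeholder
  clientOptions: {
    // assumption: prompt caching enabled via the beta header
    defaultHeaders: { "anthropic-beta": "prompt-caching-2024-07-31" },
  },
});

// Non-streaming: response_metadata.usage is populated.
const response = await model.invoke("Summarize the cached document.");
console.log(response.response_metadata);

// Streaming: chunks arrive with empty response_metadata.
for await (const chunk of await model.stream("Summarize the cached document.")) {
  console.log(chunk.response_metadata, chunk.usage_metadata);
}
```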
Description
When I use Anthropic without streaming, I can get cache usage information from the model response via the "response_metadata" field and the "usage" object inside it, like this:
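(The logged output wasn't captured here; illustratively, with made-up numbers, it looks roughly like this:)

```json
{
  "id": "msg_...",
  "model": "claude-3-5-sonnet-20241022",
  "usage": {
    "input_tokens": 21,
    "cache_creation_input_tokens": 1024,
    "cache_read_input_tokens": 0,
    "output_tokens": 183
  }
}
```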
But when I use streaming, the "response_metadata" field is empty, and the only way to check token usage is the "usage_metadata" field, which has no information about the cache:
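(Again illustrative, with made-up numbers; "usage_metadata" only carries plain token counts, no cache fields:)

```json
{
  "input_tokens": 21,
  "output_tokens": 183,
  "total_tokens": 204
}
```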
Also, when I log every individual chunk, the result is the same: "response_metadata" is empty, and "usage_metadata" is the only option, but it has no cache usage info.
Is this something missing from LangChain's implementation, or am I looking in the wrong place?
System Info
"@langchain/anthropic": "^0.3.5",
"@langchain/core": "^0.3.13",
"langchain": "^0.3.3"
node v20.11.0