I want to run a simple text inference against anthropic.claude-3-5-sonnet models. This is not a chat interaction, and I would prefer not to use the Messages API, which means I want to use Bedrock's InvokeModel API rather than the Converse API (and their streaming equivalents).
As far as I can tell, both BedrockChatConverse and BedrockChat now use AWS's Converse API under the hood, and BedrockLLM doesn't support Claude v3 models.
Is there a way to use the InvokeModel API via LangChain?
As an aside, the message structure required by the Converse API is too opinionated and rigid, and causes a lot of issues with LangGraph. The requirement that messages always alternate between user and assistant is a problem when LangGraph has several agent nodes adding messages in sequence without user input.
ChatBedrock supports the InvokeModel API. By default, the beta_use_converse_api flag is False and the InvokeModel API is called internally; if a user passes it as True, ChatBedrock switches to the Converse API.
Following is a code snippet that uses ChatBedrock to invoke anthropic.claude-3-5-sonnet-20241022-v2:0:
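The flag toggle looks like this (a minimal configuration sketch; the model ID and region are illustrative, and actually invoking either object requires valid AWS credentials):

```python
from langchain_aws import ChatBedrock

# Default: beta_use_converse_api=False, so Bedrock's InvokeModel API is used.
invoke_llm = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
)

# Explicitly opt in to the Converse API instead.
converse_llm = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
    beta_use_converse_api=True,
)
```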
from langchain_aws import ChatBedrock

def main():
    llm = ChatBedrock(
        model_id='us.anthropic.claude-3-5-sonnet-20241022-v2:0',
        region_name="us-east-1",
        model_kwargs={
            "max_tokens": 100,
            "top_p": 0.9,
            "temperature": 0.1,
        },
    )
    messages = [
        (
            "system",
            "You are a helpful assistant that translates English to French. Translate the user sentence.",
        ),
        ("human", "I love going out for a walk when the weather is bright and sunny."),
    ]
    # Invoke the llm
    response = llm.invoke(messages)
    print(response.content)

if __name__ == '__main__':
    main()
Output:
J'aime me promener quand il fait beau et ensoleillé.
Note: The model might not be available in all regions; check the Model Availability page of the AWS docs.
Notice that I set region_name to us-east-1 even though, per the above link, the model is not hosted in that region. To use the latest Sonnet 3.5 from us-east-1, pass the cross-region inference profile ID 'us.anthropic.claude-3-5-sonnet-20241022-v2:0' as the model_id; Bedrock will then route the inference to a region where the model is hosted. Similar thread here
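For completeness, when beta_use_converse_api is False, the body that goes to InvokeModel follows Anthropic's Bedrock messages schema. A minimal sketch of building that JSON body by hand (the helper name and default parameter values are illustrative, not part of langchain-aws):

```python
import json

def build_invoke_body(system: str, user_text: str,
                      max_tokens: int = 100,
                      temperature: float = 0.1,
                      top_p: float = 0.9) -> str:
    """Build a JSON request body for InvokeModel with Anthropic Claude 3.x models."""
    return json.dumps({
        # Required version marker for Claude 3.x models on Bedrock.
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
        # The system prompt is a top-level field, not a message role.
        "system": system,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": user_text}]},
        ],
    })

body = build_invoke_body(
    "You are a helpful assistant that translates English to French.",
    "I love going out for a walk when the weather is bright and sunny.",
)
```

The resulting string is what you would pass as body to boto3's bedrock-runtime invoke_model call (or invoke_model_with_response_stream for the streaming equivalent).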