Hello,
Here are some suggestions that I've gathered while using the vendors-gem:
Claude/Anthropic/Bedrock
The user is permitted to specify a temperature value greater than 1 or less than 0. If the model is prompted with such a flawed configuration, a pop-up appears. Considering that temperature is a parameter strictly bound between 0 and 1, perhaps implementing a slider would be effective? The same applies to Top_P.
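A slider would enforce the bound in the UI; until then, the requester could clamp the value before building the request. A minimal sketch (the function name is mine, not from the gem):

```python
def clamp_parameter(value: float, low: float = 0.0, high: float = 1.0) -> float:
    """Clamp a sampling parameter such as temperature or top_p into [low, high]."""
    return max(low, min(high, value))
```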
Setting a stop sequence results in `Error: Malformed input request: #: extraneous key [stop_sequence] is not permitted, please reformat your input and try again.`
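If I read the error right, the gem is sending a singular `stop_sequence` key, while the Bedrock Claude (v2) text-completions body expects the plural `stop_sequences` array. A sketch of the shape I believe is accepted (prompt and values are only examples):

```python
import json

# The Bedrock Claude v2 text-completions API takes the plural key
# "stop_sequences" as a JSON array; a singular "stop_sequence" key is
# rejected as an "extraneous key", matching the error above.
body = {
    "prompt": "\n\nHuman: Hello\n\nAssistant:",
    "max_tokens_to_sample": 256,
    "stop_sequences": ["\n\nHuman:"],
}
payload = json.dumps(body)
```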
Anthropic model IDs available on Bedrock (🟢 = works, 🔴 = does not work):
🟢 `anthropic.claude-instant-v1`
🟢 `anthropic.claude-v2`
🟢 `anthropic.claude-v2:1`
🔴 `anthropic.claude-3-haiku-20240307-v1:0`: `Error: Could not resolve the foundation model from the provided model identifier.`
🔴 `anthropic.claude-3-sonnet-20240229-v1:0`: `Error: Could not resolve the foundation model from the provided model identifier.`
**The prompt format changes between the v2 and v3 models. Consider changing the text box to a list of supported models (or two lists: the first for choosing anthropic/cohere/amazon/mistral, the second for choosing the supported model).** link (see #API request)
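To illustrate why a model list matters, here is how the request body differs between the two families, to the best of my knowledge (values are my examples): Claude v2 uses the text-completions format, while the Claude 3 models use the messages API and require an `anthropic_version` field.

```python
# Illustrative Bedrock request bodies for the two Anthropic model families.
claude_v2_body = {
    "prompt": "\n\nHuman: What is O3DE?\n\nAssistant:",
    "max_tokens_to_sample": 256,
}
claude_3_body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "What is O3DE?"}],
}
```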
As previously mentioned, different models hosted on AWS Bedrock may utilize different prompt formats. Therefore, naming the requester 'AWS SDK Requester component' might be misleading, especially when it's actually a Claude/Anthropic requester. Additionally, the Claude/Anthropic requester currently does not support all available Anthropic models.
However, I believe that if the requester is enhanced with different configurations for various vendors and models, changing the name may not be necessary.
Consider using a list with allowed keyboard input for Region Name.
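A sketch of what "list with allowed keyboard input" could mean in practice: filter a known-regions list as the user types instead of accepting free-form text. The region subset below is illustrative, not exhaustive.

```python
# Illustrative subset of regions where Bedrock is available (not exhaustive).
BEDROCK_REGIONS = ["us-east-1", "us-west-2", "ap-southeast-1", "eu-central-1"]

def matching_regions(prefix: str) -> list[str]:
    """Return known regions matching what the user has typed so far."""
    return [r for r in BEDROCK_REGIONS if r.startswith(prefix)]
```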
I find setting the model ID in the requester odd, especially when there is another tab called 'Model Configurations', which in the case of ollama holds both the model name and the parameters. I believe that replacing requesters with 'Backends' might be a sensible option, for example:
AWS Backend
- Region Name

Ollama Backend
- Endpoint `ip:port/`
- Alternatively, a list of choices (`/api/chat`; `/api/generate`)
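For reference, the two Ollama endpoints differ in payload shape as well as path; a sketch of both, matching the public Ollama API as I understand it (model and prompt values are my examples):

```python
# /api/generate takes a raw prompt; /api/chat takes a messages array.
base = "http://127.0.0.1:11434"

generate_request = (
    base + "/api/generate",
    {"model": "llama2", "prompt": "Hello", "stream": False},
)
chat_request = (
    base + "/api/chat",
    {"model": "llama2",
     "messages": [{"role": "user", "content": "Hello"}],
     "stream": False},
)
```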
Ollama
Naming: `BasicJSONRequesterComponent`. Is this component reusable for other backends/vendors?
What is "Content type" in the `BasicJSONRequesterComponent` configuration?
https://github.com/RobotecAI/ai-vendors-gems/blob/0cafcfa0cc8056d9cbb44eeef45837613264a1b9/Gems/GenAIOllama/Code/Source/Communication/JSONHttp/BasicJSONRequesterComponent.cpp#L43-L47
Consider changing the following into:
"A URL with a port pointing to an HTTP endpoint (format `address:port/endpoint_url`)"
https://github.com/RobotecAI/ai-vendors-gems/blob/0cafcfa0cc8056d9cbb44eeef45837613264a1b9/Gems/GenAIOllama/Code/Source/Communication/JSONHttp/BasicJSONRequesterComponent.cpp#L42
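To make the suggested wording concrete, a hypothetical normalizer for the `address:port/endpoint_url` format (the helper name and behavior are my assumption, not the gem's code):

```python
from urllib.parse import urlparse

def normalize_endpoint(endpoint: str) -> str:
    """Accept 'address:port/endpoint_url' with or without a scheme; default to http."""
    if "://" not in endpoint:
        endpoint = "http://" + endpoint
    parsed = urlparse(endpoint)
    if not parsed.netloc:
        raise ValueError(f"not a valid endpoint: {endpoint!r}")
    return endpoint
```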
Model configuration for ollama models