Future work suggestions #45

Open
maciejmajek opened this issue Apr 4, 2024 · 1 comment

Comments

maciejmajek (Member) commented Apr 4, 2024

Hello,
Here are some suggestions that I've gathered while using the vendors-gem:

Claude/Anthropic/Bedrock

  • The user is permitted to enter temperature values outside the 0–1 range. If the model is prompted with such a flawed configuration, an error pop-up appears. Since temperature is a parameter strictly bounded between 0 and 1, implementing a slider might be more effective. The same applies to Top_P.

  • Setting a stop sequence results in `Error: Malformed input request: #: extraneous key [stop_sequence] is not permitted, please reformat your input and try again.`

  • Anthropic model IDs available on Bedrock (🟢 works, 🔴 does not work):

    • 🟢 anthropic.claude-instant-v1
    • 🟢 anthropic.claude-v2
    • 🟢 anthropic.claude-v2:1
    • 🔴 anthropic.claude-3-haiku-20240307-v1:0 fails with `Error: Could not resolve the foundation model from the provided model identifier.`
    • 🔴 anthropic.claude-3-sonnet-20240229-v1:0 fails with `Error: Could not resolve the foundation model from the provided model identifier.`
    • **The prompt format changes between the v2 and v3 models. Consider changing the text box to a list of supported models (or two lists: the first for choosing Anthropic/Cohere/Amazon/Mistral, the second for choosing a supported model)** link (see #API request)
  • As previously mentioned, different models hosted on AWS Bedrock may utilize different prompt formats. Therefore, naming the requester 'AWS SDK Requester component' might be misleading, especially when it's actually a Claude/Anthropic requester. Additionally, the Claude/Anthropic requester currently does not support all available Anthropic models.
    However, I believe that if the requester is enhanced with different configurations for various vendors and models, changing the name may not be necessary.

  • Consider using a list with allowed keyboard input for Region Name.

  • I find setting the model ID in the requester odd, especially when there is another tab called 'Model Configurations', which in the case of Ollama holds both the model name and parameters. I believe that replacing requesters with 'Backend' might be a sensible option.

    • AWS Backend
      • Region Name
    • Ollama Backend
      • Endpoint ip:port/
      • Alternatively, a list of choices (/api/chat; /api/generate)
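The v2-vs-v3 prompt-format difference mentioned above can be sketched as follows. This is a minimal illustration based on the public Bedrock request formats for Anthropic models, not the gem's actual code; the `max_tokens` values and helper names are arbitrary. Note the plural `stop_sequences` key, which is likely what the malformed-input error above is about:

```python
import json

def claude_v2_body(prompt: str, temperature: float = 0.5, stop=None) -> str:
    """Text-completions format used by anthropic.claude-v2 / claude-instant-v1."""
    assert 0.0 <= temperature <= 1.0, "temperature must be in [0, 1]"
    return json.dumps({
        # v2 models expect a single prompt string with Human/Assistant turns.
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 512,
        "temperature": temperature,
        # Plural key: a singular 'stop_sequence' is rejected as an extraneous key.
        "stop_sequences": stop or [],
    })

def claude_3_body(prompt: str, temperature: float = 0.5, stop=None) -> str:
    """Messages-API format required by anthropic.claude-3-* model IDs."""
    assert 0.0 <= temperature <= 1.0, "temperature must be in [0, 1]"
    return json.dumps({
        # Claude 3 requires the versioned messages format instead of 'prompt'.
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "temperature": temperature,
        "stop_sequences": stop or [],
        "messages": [{"role": "user", "content": prompt}],
    })
```

Because the two bodies share no required fields beyond the sampling parameters, a single free-text model-ID box cannot hint at which format applies, which supports the suggestion of a per-vendor model list.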

Ollama

@jhanca-robotecai jhanca-robotecai transferred this issue from another repository Apr 11, 2024
jhanca-robotecai (Collaborator) commented: I transferred this issue from the ai-vendor-gems repo due to the code transfer. All comments in this issue should be addressed in the ai-core-gem repository.
