This package is still a work in progress 🛠️, so the interface is subject to change. With that said, the package currently supports:
- 🚀 NEW! Support for many APIs:
  - Anthropic
  - OpenAI
  - OpenAI-compatible APIs with a custom URL
    - (includes local providers such as Ollama or LM Studio)
  - OpenRouter, with support for hundreds of models and many OpenRouter-exclusive features
- Creating and parsing ChatML-style requests and responses.
- Creating and parsing raw prompt-style requests and responses.
- Formatting prompt strings with `[INST]`, `<<SYS>>`, and `<s>` tags for models with llama-style fine-tuning.
- Most common LLM parameters, such as `temperature`, `top_p`, `top_k`, `repetition_penalty`, etc.
- OpenRouter-specific features like fallback models and provider preferences.
- LLM tool use: this enables the AI model to call Roc functions and use the results in its answers.
  - Includes a collection of prebuilt tools, or you can build your own.
- Prompt caching on supported models
- Tool use is currently not supported with the Anthropic API.
  - This is due to missing support in Roc for decoding JSON dictionaries (roc#5294).
  - Workaround: Anthropic models can be accessed through OpenRouter, with full tool calling support.
- Prompt caching has currently only been tested through OpenRouter.
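
For reference, the llama-style tags listed above combine into a prompt string with roughly the following shape. This is an illustrative sketch only: the system prompt and user message shown are placeholder text, and the package's prompt formatting handles this assembly for you.

```roc
# Illustrative only: the approximate layout of a llama-style prompt
# built from the <s>, [INST], and <<SYS>> tags.
# "You are a helpful assistant." and "Hello, computer!" are
# placeholder system and user messages, not package defaults.
example_prompt : Str
example_prompt =
    "<s>[INST] <<SYS>>\nYou are a helpful assistant.\n<</SYS>>\n\nHello, computer! [/INST]"
```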
```roc
main! = |_|
    api_key = Env.var!("OPENAI_API_KEY")?
    client =
        Chat.new_client({ api: OpenAI, api_key, model: "gpt-4o" })
        |> Chat.append_user_message("Hello, computer!", {})
    response = Http.send!(Chat.build_http_request(client, {}))?
    messages = Chat.update_messages(client, response)? |> .messages
    when List.last(messages) is
        Ok(message) -> Stdout.line!(message.content)
        _ -> Ok({})
```
For complete example apps, including a full chatbot app with tool use, see the examples folder.