-
Resolved via an OpenAI curl request.
-
👍 json_object or json_schema would be preferable, but we'd need a way to expose the mode. It can be configured via ml_model_list.py for built-in models, but I don't have a method for custom models or OpenAI-compatible servers yet.
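For reference, here is a minimal sketch of what the two modes look like on the wire against an OpenAI-compatible `/chat/completions` endpoint. The base URL, model name, and schema are placeholders; whether a given custom server actually honors `json_schema` is exactly the open question here.

```python
from openai import OpenAI

# Point the client at any OpenAI-compatible server; base_url and model are placeholders.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# json_object mode: the server only guarantees syntactically valid JSON.
resp = client.chat.completions.create(
    model="my-custom-model",
    messages=[{"role": "user", "content": "Return a JSON object with keys 'city' and 'population'."}],
    response_format={"type": "json_object"},
)

# json_schema mode: the server constrains output to the supplied schema (if supported).
resp = client.chat.completions.create(
    model="my-custom-model",
    messages=[{"role": "user", "content": "Give me a city."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "city_info",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "population": {"type": "integer"},
                },
                "required": ["city", "population"],
                "additionalProperties": False,
            },
        },
    },
)
print(resp.choices[0].message.content)
```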
-
I'm currently writing a tool-calling proxy that sits between Kiln and more popular constrained-generation backends (which don't implement true function calling). Is there an example of what payload Kiln expects back from the API?
I tried just returning an OpenAI-spec response as the body, but that just threw a model_dump() error.
Thanks!
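For context, below is a minimal sketch of a standard OpenAI-style `/chat/completions` response body containing a tool call; this is the generic OpenAI wire format, not a confirmed Kiln contract, and all field values are placeholders. The model_dump() error is a hint (not confirmed here) that the consuming code parses the response into a Pydantic object, so the proxy likely needs to return this exact shape rather than a custom body.

```python
# Generic OpenAI chat.completion response with a tool call.
# Placeholder values throughout; not a confirmed Kiln-specific payload.
example_response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1720000000,
    "model": "my-proxied-model",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": None,
                "tool_calls": [
                    {
                        "id": "call_abc123",
                        "type": "function",
                        "function": {
                            "name": "get_weather",
                            # Arguments are a JSON-encoded string, not a nested object.
                            "arguments": "{\"city\": \"Paris\"}",
                        },
                    }
                ],
            },
            "finish_reason": "tool_calls",
        }
    ],
    "usage": {"prompt_tokens": 10, "completion_tokens": 5, "total_tokens": 15},
}
```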