Thanks for building this tool! Exposing a local ollama model as an OpenAI-compatible API through oneapi doesn't work for me. It would be great if support for ollama could be added.
In principle, any API that follows the OpenAI-compatible format should already work. You only need to set these variables as docker environment variables:
FC_OPENAI_AUTH_KEY: skxxxxxx  # authentication key
FC_OPENAI_DEFAULT_MODEL: gemini-pro/chatgpt-3.5/...  # default model to use
FC_OPENAI_ENDPOINT: https://xxxxxx  # endpoint of the OpenAI API or a compatible platform
If it doesn't work, first check that the variables above are valid, and also verify that another tool which supports the OpenAI API can use the same endpoint successfully.
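For that check, a minimal sketch with the official openai Python client may help, assuming a oneapi instance reachable at http://localhost:3000/v1 and a model named llama3 (these values are placeholders; substitute your actual FC_OPENAI_ENDPOINT, FC_OPENAI_AUTH_KEY, and model name):

```python
from openai import OpenAI

# Hypothetical values: base_url should match FC_OPENAI_ENDPOINT and
# api_key should match FC_OPENAI_AUTH_KEY.
client = OpenAI(
    base_url="http://localhost:3000/v1",
    api_key="sk-xxxxxx",
)

# "llama3" is a placeholder for whichever model ollama serves through oneapi.
resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```

If this request fails, the problem is in the oneapi/ollama setup rather than in this tool.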
If it still doesn't work, please share your ollama version and the deployed model, and I will try to reproduce it locally.