Add Xllama (a custom LLM) to the LLM server so it can be run locally, similar to Ollama.
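As a hedged sketch only: if Xllama exposes an Ollama-style HTTP API (an assumption — the port, endpoint path, and model name below are hypothetical, not confirmed by this change), a local generation request might be built like this:

```python
import json
import urllib.request

# Hypothetical Ollama-style endpoint; Xllama's actual URL and API may differ.
XLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "xllama") -> urllib.request.Request:
    """Build a JSON POST request for an Ollama-style /api/generate endpoint."""
    payload = json.dumps({
        "model": model,    # hypothetical model name
        "prompt": prompt,
        "stream": False,   # request a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        XLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Why is the sky blue?")
# Sending it requires the local server to be running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

This only constructs the request; the commented-out lines show how it would be sent once the local server is up.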