OpenWebUI doesn't connect to FastMLX #35

Open
SwagMuffinMcYoloPants opened this issue Oct 16, 2024 · 3 comments

SwagMuffinMcYoloPants commented Oct 16, 2024

I was messing with LM Studio's local server connected to OpenWebUI, and it was really convenient to use the two together. I don't really need the LM Studio application, though, and would love to just use FastMLX instead. When I add FastMLX as an OpenAI connection in OpenWebUI, it doesn't return the models. Would it be possible to get FastMLX working with OpenWebUI?
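For reference, OpenWebUI populates its model dropdown by calling an OpenAI-compatible `GET /v1/models` endpoint and expects OpenAI's "list" envelope in the response. Below is a minimal sketch of that response shape as a FastAPI route; the `MODELS` list and field values are illustrative, not FastMLX's actual internals, so treat it only as a description of what OpenWebUI looks for.

```python
# Minimal sketch of an OpenAI-compatible /v1/models endpoint, the shape
# OpenWebUI queries when you add an "OpenAI API" connection.
# The in-memory MODELS list is hypothetical, not FastMLX's real model registry.
import time

from fastapi import FastAPI

app = FastAPI()

# Hypothetical set of locally loaded MLX models.
MODELS = ["mlx-community/Llama-3.2-3B-Instruct-4bit"]

@app.get("/v1/models")
async def list_models():
    # OpenWebUI expects the OpenAI "list" envelope, not a bare array:
    # {"object": "list", "data": [{"id": ..., "object": "model", ...}, ...]}
    return {
        "object": "list",
        "data": [
            {
                "id": name,
                "object": "model",
                "created": int(time.time()),
                "owned_by": "fastmlx",
            }
            for name in MODELS
        ],
    }
```

If the server returns this envelope, pointing OpenWebUI's OpenAI API base URL at the FastMLX host and port (with the `/v1` prefix) should list the models; whether FastMLX's current `/v1/models` response matches this format is probably the first thing to check.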

viljark commented Oct 16, 2024

I haven't had a chance to submit proper PRs back to this repo yet, but in the meantime you can check my fork, where I have implemented OpenWebUI support and some other things I needed.

Blaizzy (Owner) commented Oct 16, 2024

@SwagMuffinMcYoloPants thanks for bringing this up.

I'm working on a major release of MLX-VLM, and this weekend I will be updating FastMLX with lots of goodies. I can add OpenWebUI support.

Blaizzy (Owner) commented Oct 16, 2024

@viljark feel free to propose the changes you want and open a PR with the OpenWebUI support from your fork.

Blaizzy self-assigned this Oct 16, 2024