I was experimenting with LMStudio's local server and connecting it to OpenWebUI, and it was really convenient to link the two. I don't actually need the LMStudio application and would love to just use FastMLX instead. However, when I try to add FastMLX as an OpenAI connection, it doesn't return the list of models. Would it be possible to get FastMLX working with OpenWebUI?
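For context, OpenWebUI discovers models on an OpenAI connection by calling `GET /v1/models` and expects the standard OpenAI models-list shape. A minimal sketch of the response body such an endpoint would need to produce (the helper name and the example model ID are placeholders, not FastMLX's actual code):

```python
# Hypothetical helper: builds the OpenAI-style body that an OpenAI-compatible
# client such as OpenWebUI expects back from GET /v1/models.
def openai_models_response(model_ids):
    return {
        "object": "list",
        "data": [
            # Each entry describes one model; "id" is what the client
            # shows in its model picker and sends back in chat requests.
            {"id": mid, "object": "model", "created": 0, "owned_by": "fastmlx"}
            for mid in model_ids
        ],
    }

# Example with a placeholder MLX model name:
print(openai_models_response(["mlx-community/Meta-Llama-3-8B-Instruct-4bit"]))
```

If FastMLX's models route returns a different shape (or lives at a different path), OpenWebUI will show an empty model list even though the server is reachable, which matches the behavior described above.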
I haven't had a chance to submit proper PRs back to this repo, but in the meantime you can check my fork, where I have implemented OpenWebUI support along with some other features I needed.