lm-studio is a tool for running LLMs locally. After doing this by hand (as the plugin currently does) and trying other tools, it's clear (to me) that lm-studio is going to be the preferred way to run local models going forward. lm-studio compatibility would make it much easier to try different models and reduce the effort of experimenting with them. I highly recommend trying it out.

https://github.com/lmstudio-ai
Benefits:
Dead simple to start/stop the server, load/unload models, etc. with no GUI interaction, via the lms tool: https://github.com/lmstudio-ai/lms
Downsides:
Commercial use is restricted
Main project does not seem to be open source
Linux support is not finished yet, only Mac and Windows so far
All in all, the downsides are relatively minor, and the upside is that it becomes trivial to try any model your hardware supports. The web server simply exposes an OpenAI-compatible API, so integration could be as simple as letting users provide the API URL they want to use, or as feature-rich as letting users start the server and choose models directly from Binja.
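Because the endpoint speaks the standard OpenAI chat-completions protocol, the integration surface is tiny: the plugin only needs a base URL setting and an ordinary request payload. A sketch, assuming lm-studio's usual default port of 1234 (configurable in the app) and a placeholder model name:

```python
import json
from urllib.request import Request, urlopen


def build_chat_request(base_url: str, model: str, prompt: str) -> Request:
    """Build an OpenAI-style /v1/chat/completions request for any
    compatible server (lm-studio, llama.cpp server, etc.)."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# Usage: point at whatever URL the user configured in the plugin settings.
req = build_chat_request("http://localhost:1234", "local-model", "Explain this function.")
# resp = json.load(urlopen(req))  # uncomment with a running server
# print(resp["choices"][0]["message"]["content"])
```

Swapping models then requires no plugin changes at all, only loading a different model in lm-studio.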
Hi, thanks for the suggestion! I see the appeal, but I won't be implementing it anytime soon. However, please feel free to implement it yourself and submit a PR.