
Feature request: Integration with lm-studio #8

Open
brandon-fryslie opened this issue Jun 29, 2024 · 1 comment

@brandon-fryslie

lm-studio is a tool for running LLMs locally. After doing this by hand (as the plugin currently does) and trying other tools, it's clear (to me) that lm-studio is going to be the preferred way to run local models going forward. lm-studio compatibility would make it far easier to try different models and would reduce the development effort of experimenting with new ones. I highly recommend trying it out.

https://github.com/lmstudio-ai

Benefits:

  • Well supported and stable
  • Extremely simple to run a huge variety of models
  • Automatic but easily overridable configuration for models
  • Trivial to run multiple models at once for comparison (if you have the hardware)
  • Full CLI integration via the lms tool: https://github.com/lmstudio-ai/lms
    • Dead simple to start/stop the server, load/unload models, etc., with no GUI interaction
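
That last point can be sketched concretely. A minimal lms session might look like the following; the exact subcommands and flags may differ by version, and the model identifier is a placeholder, so check `lms --help` for what your install supports:

```
lms server start      # start the local API server, no GUI interaction needed
lms ls                # list models downloaded locally
lms load <model>      # load a model by its identifier
lms unload --all      # unload everything when done
lms server stop       # shut the server down
```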

Downsides:

  • Commercial use is restricted
  • Main project does not seem to be open source
  • Linux support is not finished yet, only Mac and Windows so far

All in all, the downsides are relatively minimal, and the upside is that it becomes trivial to try any model your hardware can support. The web server simply exposes an OpenAI-compatible API, so integration could be as simple as letting users provide the API URL they want to use, or as feature-rich as letting users start the server on demand and choose models directly from Binja.
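
To illustrate how small the "just provide a URL" variant could be, here is a hedged Python sketch of calling an OpenAI-compatible chat endpoint using only the standard library. The base URL and model name are assumptions the user would configure (LM Studio's local server commonly defaults to port 1234, but verify against its docs):

```python
import json
import urllib.request

# Assumed default; in the plugin this would come from user configuration.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(base_url, model, prompt):
    """Build the URL and JSON body for an OpenAI-compatible
    /chat/completions call. Kept separate from I/O so it is easy to test."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return f"{base_url}/chat/completions", json.dumps(body).encode("utf-8")


def ask(base_url, model, prompt):
    """Send the request and return the assistant's reply text."""
    url, payload = build_chat_request(base_url, model, prompt)
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]
```

Because the API is OpenAI-compatible, the same code would work against any other backend that speaks the same protocol; only `BASE_URL` changes.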

@mrphrazer
Owner

Hi, thanks for the suggestion! I do see the points, but I won't implement this anytime soon. However, please feel free to do it on your own and maybe submit a PR.
