MinimalChat is an open-source, lightweight chat application supporting multiple language models, including OpenAI o1, DeepSeek R1, and various local/custom model endpoints. It is designed to be simple, fully featured, and highly responsive, with full mobile PWA support.
```shell
docker pull tannermiddleton/minimal-chat:latest
```
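To run the pulled image, something along these lines should work; the host port (3000) and the container's internal port (assumed here to be 8080) are illustrative, so check the image's documentation for the port it actually exposes:

```shell
# Hypothetical port mapping: adjust 8080 to the port the image actually exposes
docker run -d --name minimal-chat -p 3000:8080 tannermiddleton/minimal-chat:latest
# then open http://localhost:3000 in your browser
```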
Thanks to WebLLM, you can now download and cache popular LLM models such as llama-3-8b-instruct directly in your browser.
- Install packages: `npm install`
- Build the app: `npm run build`
- Start a local server: `npm run preview` for production mode, or `npm run dev` for development mode.
When hosting API servers locally (such as LM Studio, Ollama, or other custom endpoints), it is crucial to properly configure CORS (Cross-Origin Resource Sharing) settings. This ensures that your browser-based application (like MinimalChat) can communicate with your local API server without being blocked by browser security policies.
The steps to enable CORS depend on the API server you are using:
- When starting your LM Studio server, enable the CORS option in its server settings before starting it.
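For example, if you serve models with Ollama (mentioned above), it reads the `OLLAMA_ORIGINS` environment variable to decide which browser origins may call it. A permissive sketch for local development (the wildcard value is illustrative; prefer listing your app's actual origin):

```shell
# Allow any browser origin to call the local Ollama server (development only)
OLLAMA_ORIGINS="*" ollama serve
```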

- Ensure that your API server is running and accessible.
- Verify that the CORS settings are correctly configured and applied.
- Check the browser's developer console for detailed error messages.
- If using a custom domain or port, make sure it is included in the allowed origins list.
By properly configuring CORS, you can ensure seamless communication between your locally hosted API server and the MinimalChat application.
Visit the Wiki for detailed configuration options.
- Minimal layout
- Voiced conversational interactions with STT (speech-to-text) and TTS (text-to-speech)
- Supports multiple language models:
- Any OpenAI response-formatted API (custom/local)
- Load and host full models locally in your own browser with WebLLM
- Switch models mid-conversation
- Swipe gestures for quick settings and conversation access
- Edit, regenerate, or delete past messages
- Markdown support
- Code syntax highlighting
- Basic DALL-E 3 integration
- Conversation importing/exporting
- Mobile responsive layout
- PWA support
**Is MinimalChat free to use?**
Yes, MinimalChat is open-source and free. However, API keys are required for some language models.
**Can I use MinimalChat with local models?**
Yes, by using LM Studio to host a local LLM model or by loading a full model into your browser.
**Are my conversations private?**
Yes, all conversations are stored locally on your device.
**Does MinimalChat work on mobile?**
Yes, it is fully mobile-compatible and can be installed as a PWA.
- Swipe left on the input box to open Conversations.
- Swipe right on the input box to open Settings.
- Double tap the settings page to expand/collapse the side panel.
Supports any API endpoint that returns responses formatted according to OpenAI's API specification.
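As a rough sketch of what "OpenAI response-formatted" means in practice, the helper below builds a chat-completions request and reads the reply from `choices[0].message.content`; the base URL and model name are placeholders for whatever your local server (for example, LM Studio's local endpoint) exposes:

```javascript
// Build the request shape an OpenAI-compatible /chat/completions endpoint expects.
function buildChatRequest(model, messages) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

// Send one user message and return the assistant's reply text.
// baseUrl is a placeholder, e.g. "http://localhost:1234/v1" for a local server.
async function chat(baseUrl, model, userText) {
  const request = buildChatRequest(model, [{ role: 'user', content: userText }]);
  const response = await fetch(`${baseUrl}/chat/completions`, request);
  const data = await response.json();
  // OpenAI-formatted responses place the reply at choices[0].message.content
  return data.choices[0].message.content;
}
```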
We welcome contributions! Please:
- Submit issues via the issue tracker
- Fork the repository, make changes, and submit a pull request
- Follow coding style and conventions
- Provide clear commit messages and pull request descriptions
- Ensure a stable internet connection
- Verify API keys and permissions
- Clear browser cache as a last resort
Report issues via the issue tracker
Licensed under the MIT License. See LICENSE for details.
For questions, feedback, or suggestions:
- GitHub Issues
- Discord: fingerthief#0453
Thank you for using MinimalChat!