Sidekick is an intelligent conversational assistant for your desktop, powered by local AI through Ollama. It provides a seamless desktop experience for interacting with various AI models while keeping all your data private and secure on your local machine.
- 🖥️ Standalone desktop application with an intuitive user interface built with CustomTkinter
- 🚀 Fast responses with local AI processing
- 🔒 Complete privacy - all processing happens on your machine
- 💾 Conversation history stored in local SQLite database
- 🪶 Lightweight and efficient performance
- 🌐 Cross-platform support for Windows, macOS, and Linux
Before installing Sidekick, you need to set up Ollama on your system:
- Install Ollama from ollama.com
- You can install AI models in two ways:
  - Through the UI: Open Sidekick's settings (⚙️) and use the "Install New Model" section. Enter the model name (e.g., `phi:latest`) and click "Install Model". Be patient during the download, as models can be quite large (several gigabytes). A sketch of the underlying pull call appears after this list.
  - Through the command line: `ollama pull deepseek-r1:latest` (or `ollama pull phi:latest`)
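Either way, the model is downloaded by the local Ollama server. If you want to script this yourself, here is a minimal sketch that talks to Ollama's standard REST API (`/api/pull`); it is an illustration, not Sidekick's actual implementation, and the host and model names are just examples:

```python
# Hypothetical sketch: pull a model through Ollama's HTTP API.
import json
import os

import requests

OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")


def pull_model(model: str = "phi:latest") -> None:
    """Stream Ollama's pull progress so large downloads show feedback."""
    with requests.post(
        f"{OLLAMA_HOST}/api/pull",
        json={"model": model, "stream": True},
        stream=True,
        timeout=None,  # model downloads can take a long time
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                status = json.loads(line)
                print(status.get("status", ""))


if __name__ == "__main__":
    pull_model()
```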
Sidekick now provides automated installation scripts that will set up everything you need, including Ollama and a default AI model. Choose the appropriate method for your operating system:
- Open PowerShell as Administrator
- Navigate to the project directory
- Run:
Set-ExecutionPolicy RemoteSigned -Scope Process; .\install.ps1
- Open Terminal
- Navigate to the project directory
- Run:
chmod +x install.sh
./install.sh
The installation script will:
- Install Ollama if not already installed
- Download and set up a small Llama2 model
- Create a Python virtual environment
- Install all required dependencies
- Build the executable using PyInstaller
After installation completes, you'll find the Sidekick executable in the `dist` folder.
If you prefer to install components manually, follow these steps:
Sidekick requires Python 3.10 or higher. We recommend creating a virtual environment:
python -m venv venv
# On Windows
.\venv\Scripts\activate
# On Mac/Linux
source venv/bin/activate
Install the requirements:
pip install -r requirements.txt
Create a `.env` file in the project root:
OLLAMA_HOST="http://localhost:11434"
AI_MODEL="deepseek-r1:latest" # or your preferred model
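If you are writing your own scripts against the same configuration, loading these values with python-dotenv looks roughly like this (a sketch under assumed defaults, not necessarily how Sidekick itself reads them):

```python
# Minimal sketch: read the .env settings shown above.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the project root

OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")
AI_MODEL = os.getenv("AI_MODEL", "deepseek-r1:latest")

print(f"Talking to {OLLAMA_HOST} with model {AI_MODEL}")
```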
Start the application:
python main.py
To create a standalone executable:
pyinstaller --onefile --windowed --icon="src/images/sidekick.ico" --noconsole --hidden-import=tkinter --name="Sidekick" --add-data="src/images:images" src/main.py
The executable will be created in the `dist` directory. Note that on Windows, PyInstaller expects a semicolon as the separator in `--add-data` (e.g. `--add-data="src/images;images"`); the colon form shown above applies to Mac/Linux.
Sidekick provides a text-based conversational interface using Ollama's local AI models. All processing happens on your machine, ensuring privacy and quick response times. The application stores conversation history in a local SQLite database, allowing for context-aware interactions while maintaining data privacy.
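To illustrate how a local history can feed context back into the model, here is a minimal sketch using Python's built-in `sqlite3` and Ollama's `/api/chat` endpoint. The `messages` table and the `ask` helper are hypothetical and do not reflect Sidekick's actual schema or code:

```python
# Illustrative sketch: store conversation turns in SQLite and send them
# back to the model as context on each request.
import os
import sqlite3

import requests

OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")
AI_MODEL = os.getenv("AI_MODEL", "deepseek-r1:latest")

db = sqlite3.connect("history.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS messages ("
    "id INTEGER PRIMARY KEY AUTOINCREMENT, role TEXT, content TEXT)"
)


def ask(prompt: str) -> str:
    """Send stored history plus the new prompt, then persist both turns."""
    history = [
        {"role": role, "content": content}
        for role, content in db.execute(
            "SELECT role, content FROM messages ORDER BY id"
        )
    ]
    messages = history + [{"role": "user", "content": prompt}]
    resp = requests.post(
        f"{OLLAMA_HOST}/api/chat",
        json={"model": AI_MODEL, "messages": messages, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    answer = resp.json()["message"]["content"]
    db.execute("INSERT INTO messages (role, content) VALUES (?, ?)", ("user", prompt))
    db.execute("INSERT INTO messages (role, content) VALUES (?, ?)", ("assistant", answer))
    db.commit()
    return answer


print(ask("Hello, Sidekick!"))
```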
- Ensure Ollama is running in the background
- Verify your model is properly installed using `ollama list`
- Check the Ollama host URL in your `.env` file (a quick connectivity check is sketched below)
- Make sure your system meets the minimum requirements for your chosen model
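If the items above all look right and Sidekick still cannot connect, a short script like the following can confirm that the server is reachable and list the installed models from the same host URL. This is a sketch using Ollama's `/api/tags` endpoint (the data behind `ollama list`):

```python
# Quick health check against the Ollama server configured in .env.
import os

import requests

host = os.getenv("OLLAMA_HOST", "http://localhost:11434")

try:
    resp = requests.get(f"{host}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is reachable. Installed models:", ", ".join(models) or "none")
except requests.RequestException as exc:
    print(f"Could not reach Ollama at {host}: {exc}")
```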
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.