A Chrome extension that provides a local AI assistant powered by Ollama, with all processing happening on your machine. It adds a convenient sidebar interface for interacting with local LLMs.
- 🤖 Local AI assistant powered by Ollama
- 🔒 Completely private - all processing happens on your machine
- 📝 Interactive sidebar interface
- 💬 Real-time streaming responses (see the sketch after this list)
- 🔍 Text selection support for quick queries
- 🎯 Explain and translate functionality
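Real-time streaming maps directly onto Ollama's HTTP API, which returns one JSON object per line as tokens are generated. Below is a minimal sketch of how a sidebar script might consume that stream; the endpoint and model follow this README's setup, while `onToken` is a hypothetical callback standing in for whatever updates the UI.

```javascript
// Minimal streaming sketch against a local Ollama server.
// `onToken` is a hypothetical callback that appends text to the sidebar.
async function streamCompletion(prompt, onToken) {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2:1b", prompt, stream: true }),
  });
  if (!response.ok) throw new Error(`Ollama returned ${response.status}`);

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line); // one JSON object per line
      if (chunk.response) onToken(chunk.response);
      if (chunk.done) return;
    }
  }
}
```

Calling `streamCompletion("Why is the sky blue?", (t) => console.log(t))` prints tokens as they arrive.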
- Install Ollama
- Pull the desired model:

  ```bash
  ollama pull llama3.2:1b
  ```
- Chrome Extension Manifest Version: v3
- Supported External Providers:
- OpenAI API (o1, o1-mini, o3-mini, o3-mini-high, gpt-4o, gpt-4o-mini)
- Anthropic API (Claude 3.5 and Claude 3 series)
- Google Gemini API (Gemini 1.5 and 1.0 series)
- Perplexity API (Sonar series)
- Grok API (Grok 2 series)
- OpenRouter API (custom model selection)
- Dependencies:
- Marked.js for Markdown rendering
- Chrome Extension APIs:
- activeTab
- scripting
- sidePanel
- storage
- WebCrypto API for secure key storage (sketched after this list)
- Native Messaging for Ollama integration
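As an illustration of the WebCrypto dependency, here is one way an extension could encrypt an external API key with AES-GCM before writing it to `chrome.storage.local`. This is a sketch, not the extension's actual scheme; note that persisting the exported key alongside the ciphertext obfuscates the secret rather than fully protecting it.

```javascript
// Hypothetical sketch: encrypt an API key with AES-GCM before storing it.
async function storeApiKey(plainKey) {
  const aesKey = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    true, // extractable, so the key can be persisted for later decryption
    ["encrypt", "decrypt"]
  );
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit nonce
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    aesKey,
    new TextEncoder().encode(plainKey)
  );
  await chrome.storage.local.set({
    apiKey: {
      ciphertext: Array.from(new Uint8Array(ciphertext)),
      iv: Array.from(iv),
      jwk: await crypto.subtle.exportKey("jwk", aesKey),
    },
  });
}
```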
- Clone this repository or download the source code
- Open Chrome and go to `chrome://extensions/`
- Enable "Developer mode" in the top right
- Click "Load unpacked" and select the extension directory
- Start Ollama with the correct CORS settings:

  ```bash
  OLLAMA_ORIGINS="chrome-extension://*" ollama serve
  ```

  To make this setting permanent, you can:
  - On macOS/Linux: add it to your shell profile (`~/.zshrc`, `~/.bashrc`, etc.):

    ```bash
    export OLLAMA_ORIGINS="chrome-extension://*"
    ```

  - Or run Ollama as a service with this setting
- The extension icon should appear in your Chrome toolbar
- Click the icon to open the AI assistant sidebar
- Click the extension icon to toggle the sidebar
- Type your question in the input box and press Enter or click the send button
- Select text on any webpage and use the popup buttons (sketched after this list) to:
- Explain: Get an explanation of the selected text
- Translate: Translate the selected text to English
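The Explain/Translate buttons suggest a standard content-script-to-extension message flow. A rough sketch under that assumption (the message shape and prompt templates here are hypothetical):

```javascript
// Content script (hypothetical): send the current selection when a
// popup button is clicked.
function sendSelection(action) { // action: "explain" | "translate"
  const text = window.getSelection().toString().trim();
  if (!text) return;
  chrome.runtime.sendMessage({ type: "SELECTION_ACTION", action, text });
}

// Sidebar/background script: turn the message into a prompt.
chrome.runtime.onMessage.addListener((message) => {
  if (message.type !== "SELECTION_ACTION") return;
  const prompt = message.action === "translate"
    ? `Translate the following text to English:\n\n${message.text}`
    : `Explain the following text:\n\n${message.text}`;
  // ...feed `prompt` into the same streaming call used for typed questions
});
```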
- If you see a "403 Forbidden" error, make sure:
  - Ollama is running (`ollama serve`)
  - The OLLAMA_ORIGINS environment variable is set correctly
  - You're using the correct model name (`llama3.2:1b`)
- If the extension isn't responding:
  - Check if Ollama is running (`ollama list`; see the health-check sketch below)
  - Try restarting Ollama
  - Reload the extension in chrome://extensions
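For debugging, Ollama's `/api/tags` endpoint returns the same model list as `ollama list`, so a quick health check can be run from the extension's service-worker console (the function below is a hypothetical helper, not part of the extension):

```javascript
// Quick health check: is Ollama reachable, and is the model pulled?
async function checkOllama(model = "llama3.2:1b") {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) {
      console.error(`Ollama answered ${res.status}; check OLLAMA_ORIGINS.`);
      return false;
    }
    const { models } = await res.json();
    const found = models.some((m) => m.name === model);
    console.log(found ? `${model} is available.` : `Run: ollama pull ${model}`);
    return found;
  } catch (err) {
    console.error("Ollama is unreachable; is `ollama serve` running?", err);
    return false;
  }
}
```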
The extension is built using:
- Vanilla JavaScript
- Chrome Extension Manifest V3
- Ollama API
Known issues are listed under the Issues tab on GitHub.