Chat with AI running on your computer
Enter the URL where Ollama is running on your computer. In most cases this is the default shown below.
Ollama needs CORS enabled to work from a web browser. Start Ollama with:
OLLAMA_ORIGINS="*" ollama serve
On Windows, set the environment variable OLLAMA_ORIGINS=* before running Ollama.
OLLAMA_ORIGINS=*
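The setup above can be sketched as a short shell session. This is a minimal example, assuming Ollama's default port 11434 and its /api/tags endpoint; the origin URL in the curl check is a placeholder for wherever this web app is served from.

```shell
# Allow browser requests from any origin, then start the server.
export OLLAMA_ORIGINS="*"
# ollama serve   # uncomment to actually start Ollama

# From another terminal, verify CORS is enabled: the response headers
# should include "Access-Control-Allow-Origin: *".
# curl -sI -H "Origin: http://localhost:3000" http://localhost:11434/api/tags
```

Restricting OLLAMA_ORIGINS to the specific origin of this page (instead of "*") is a tighter setting if you know the exact URL the app is served from.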
Select an installed model or download a new one.
No models installed yet!
Switch to the "Download New" tab to get your first model.
Starting download...
Use a local model via Ollama
The URL where your Ollama instance is running
This will delete all your chat history from this browser