# Ollama setup on Linux
Follow these steps to allow the Cognito Chrome extension to connect to Ollama on your Linux system.
## 1. Install Ollama

If you haven't already, install Ollama from [ollama.com](https://ollama.com) using the install script or your distro's package manager. Make sure the `ollama` binary is on your `PATH`.
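For example, with the official install script (check [ollama.com/download](https://ollama.com/download) for the current command):

```bash
curl -fsSL https://ollama.com/install.sh | sh
```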
Optionally, pull a model:

```bash
ollama pull llama3.2
```
## 2. Set `OLLAMA_ORIGINS` and start Ollama

Ollama blocks connections from browser extensions by default. You need to set the `OLLAMA_ORIGINS` environment variable to allow the Cognito extension to connect.

Open a terminal (e.g. Ctrl + Alt + T) and run:

```bash
export OLLAMA_ORIGINS="*"
ollama serve
```
This starts Ollama with all origins allowed. To run it in the background instead:

```bash
export OLLAMA_ORIGINS="*"
nohup ollama serve > /dev/null 2>&1 &
```
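If you need to debug startup problems later, a variant that keeps a log file instead of discarding output can help (`~/ollama.log` is just an example path):

```bash
export OLLAMA_ORIGINS="*"
nohup ollama serve > ~/ollama.log 2>&1 &   # keep a log instead of /dev/null
tail -f ~/ollama.log                       # watch the startup output
```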
### Which value to use?

| Value | Description | When to use |
|---|---|---|
| `*` | Allow all origins | Recommended for most users. Simple and safe: Ollama only listens on your machine, not the internet. |
| `chrome-extension://bcejicipnpgpcbmnafmnlgmpdingjkdk` | Allow only the Cognito extension | Use if you want tighter access control. Update the ID if it changes. |
To use the extension-only value, replace `*` with `chrome-extension://bcejicipnpgpcbmnafmnlgmpdingjkdk` in the commands above. If you use a different build (e.g. unpacked), check the Cognito extension → Ollama panel for your extension's ID.
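For example, to start Ollama with only the Cognito extension allowed:

```bash
export OLLAMA_ORIGINS="chrome-extension://bcejicipnpgpcbmnafmnlgmpdingjkdk"
ollama serve
```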
## 3. Make it persistent (recommended)

The `export` command above only lasts for the current terminal session. To have `OLLAMA_ORIGINS` set automatically every time you open a terminal, add it to your shell profile.
If you use Bash (the most common default; check with `echo $SHELL`):

```bash
echo 'export OLLAMA_ORIGINS="*"' >> ~/.bashrc
source ~/.bashrc
```
If you use Zsh:

```bash
echo 'export OLLAMA_ORIGINS="*"' >> ~/.zshrc
source ~/.zshrc
```
After updating your profile, restart Ollama so it picks up the new variable:

```bash
pkill ollama
nohup ollama serve > /dev/null 2>&1 &
```
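To double-check that the restarted process actually has the variable, you can inspect its environment. This sketch assumes Ollama is running under your user (reading `/proc/<pid>/environ` for another user's process requires root):

```bash
# Print the running ollama process's environment, one variable per line
tr '\0' '\n' < /proc/"$(pgrep -x ollama | head -n1)"/environ | grep OLLAMA_ORIGINS
```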
### Using systemd? (common for distro-packaged Ollama)

If Ollama runs as a systemd service (e.g. installed via the official script), add the variable to the service configuration:

```bash
sudo systemctl edit ollama.service
```
This opens an override file in an editor. Add the following lines:

```ini
[Service]
Environment="OLLAMA_ORIGINS=*"
```
Save and close, then restart:

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama
```
The variable will now persist across reboots automatically.
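To confirm the override took effect, ask systemd for the unit's environment:

```bash
systemctl show ollama --property=Environment
# Expected output includes: Environment=OLLAMA_ORIGINS=*
```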
## 4. Verify the setting

To confirm `OLLAMA_ORIGINS` is set in your current shell:

```bash
echo "$OLLAMA_ORIGINS"   # quotes prevent the shell from expanding * into filenames
```
You should see `*` (or your chosen extension origin). If it shows nothing, the variable isn't set yet; go back to Step 2 or Step 3.
To check that Ollama is running and responding:

```bash
curl -s http://127.0.0.1:11434/api/tags
```
You should see a JSON response listing your installed models.
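You can also simulate a browser-style request to verify the CORS setup end to end. The origin below assumes the extension ID from Step 2; if it's allowed, the response headers should include an `Access-Control-Allow-Origin` line:

```bash
curl -si http://127.0.0.1:11434/api/tags \
  -H "Origin: chrome-extension://bcejicipnpgpcbmnafmnlgmpdingjkdk" | head -n 15
```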
## 5. Check that it works
- Open the Cognito extension in Chrome (or another Chromium-based browser).
- Go to Ollama / provider settings and select Ollama.
- Use Refresh or Check connection. It should show a successful connection and list your models.
- Start a chat and choose an Ollama model.
## Troubleshooting
- Connection still fails: Make sure Ollama is running (`pgrep -a ollama`). If you just added `OLLAMA_ORIGINS` to your profile, restart Ollama so it picks up the variable: `pkill ollama && nohup ollama serve > /dev/null 2>&1 &`
- Port 11434 in use: Another Ollama instance (or another program) may already be using the port. Stop it first with `pkill ollama`, then start again.
- Ollama not in PATH: Use the full path to the `ollama` binary, or add its directory to your `PATH` in your shell profile.
- Extension shows CORS error: Double-check that `OLLAMA_ORIGINS` is set to `*` or to your extension's exact origin. Restart Ollama after any change.
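If none of the above helps, Ollama's own logs usually show why a request was rejected. Two ways to get at them (the `journalctl` variant assumes the systemd setup from Step 3):

```bash
# Run in the foreground so logs print straight to the terminal
pkill ollama
OLLAMA_ORIGINS="*" ollama serve

# Or, for the systemd service, follow the unit's journal
journalctl -u ollama -f
```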
After setup, pull more models with `ollama pull <model>` or from the Cognito extension's Ollama panel.