Ollama Cloud models + Cognito
Use Ollama Cloud models (e.g. gpt-oss:20b-cloud, gpt-oss:120b-cloud) with the Cognito Chrome extension. These models run on Ollama's cloud infrastructure and are accessed through your local Ollama installation. You need to sign in at ollama.com, link your device with ollama signin, and make sure Ollama Cloud is enabled in your local config.
What are Ollama Cloud models?
Cloud models are large models (e.g. 20B, 120B parameters) that run on Ollama's servers instead of your machine. They are useful when:
- You want to use a model that's too large to run locally (e.g. 120B parameters)
- You want faster inference without using your machine's GPU/CPU
- You're on a laptop or a machine without a powerful GPU
You use cloud models the same way as local models in the Cognito extension — the extension talks to your local Ollama, and Ollama routes cloud model requests to its servers.
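Because the routing happens inside Ollama, anything that can reach the local API can see cloud models once your device is linked. A quick way to check, assuming Ollama's default port 11434:

```shell
# List the models your local Ollama exposes; once your device is linked,
# cloud models show up alongside local ones. Assumes the default port 11434.
curl -sf http://localhost:11434/api/tags 2>/dev/null \
  || echo "Ollama is not reachable on localhost:11434 - is it running?"
```

If this prints JSON, the extension will see the same model list.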
Before you start
- Ollama installed and running. If you haven't set up Ollama yet, see Overview and the platform guides (Windows, macOS, Linux).
- OLLAMA_ORIGINS configured so the Cognito extension can connect (covered in the platform guides above).
- An account on ollama.com.
Step 1: Sign in at ollama.com
In your browser, go to ollama.com and sign in (or create an account). You need to be logged in before linking your device.
Step 2: Link this device with ollama signin
Open a terminal on this computer:
- Windows: Command Prompt or PowerShell (no need for Administrator).
- macOS: Terminal (press Cmd + Space, type "Terminal", press Enter).
- Linux: Terminal (e.g. Ctrl + Alt + T).
Run:
ollama signin
Your browser will open so you can connect this device to your Ollama account. Complete the steps in the browser (approve the device, etc.).
When it says you're connected successfully, you can close the browser tab.
Step 3: Make sure Ollama Cloud is enabled
Ollama has a local config file (server.json) that controls whether cloud/remote model access is allowed. If cloud access is disabled, you'll get a 403 Forbidden error when trying to use cloud models — even if you're signed in.
Where is server.json?
| OS | Config file path |
|---|---|
| macOS | ~/.ollama/server.json |
| Linux | ~/.ollama/server.json |
| Windows | %USERPROFILE%\.ollama\server.json |
Check and update server.json
Open the config file and make sure disable_ollama_cloud is set to false:
{
"disable_ollama_cloud": false
}
⚠️ Important: If the file already has other settings, don't overwrite the whole file; just add or update the "disable_ollama_cloud": false key inside the existing JSON.
macOS
Option A — Using Finder:
- Open Finder, press Cmd + Shift + G, type ~/.ollama and press Enter.
- Look for server.json:
  - If it exists: double-click to open in TextEdit (or right-click → Open With → your preferred editor).
  - If it doesn't exist: create a new file called server.json in that folder.
- Make sure "disable_ollama_cloud": false is in the JSON.
Option B — Using Terminal:
cd ~/.ollama
nano server.json
Add or update "disable_ollama_cloud": false, then save (Ctrl + O, Enter, Ctrl + X).
Windows
Option A — Using File Explorer:
- Press Win + R, type %USERPROFILE%\.ollama and press Enter.
- Look for server.json:
  - If it exists: right-click → Open with → Notepad.
  - If it doesn't exist: create a new file called server.json in that folder.
- Make sure "disable_ollama_cloud": false is in the JSON.
Option B — Using PowerShell:
cd "$env:USERPROFILE\.ollama"
notepad server.json
Add or update "disable_ollama_cloud": false and save.
Linux
Option A — Using a file manager:
- Open your file manager and navigate to ~/.ollama/ (press Ctrl + H to show hidden files/folders).
- Look for server.json:
  - If it exists: open with any text editor.
  - If it doesn't exist: create a new file called server.json in that folder.
- Make sure "disable_ollama_cloud": false is in the JSON.
Option B — Using Terminal:
cd ~/.ollama
nano server.json
Add or update "disable_ollama_cloud": false, then save (Ctrl + O, Enter, Ctrl + X).
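If you'd rather script the edit than open an editor, the sketch below does the same thing safely: it adds or updates only the one key, so any other settings in the file survive. It assumes python3 is on your PATH.

```shell
# Add or update "disable_ollama_cloud": false in ~/.ollama/server.json
# without touching any other keys. Assumes python3 is available.
CONFIG="$HOME/.ollama/server.json"
mkdir -p "$HOME/.ollama"
[ -f "$CONFIG" ] || echo '{}' > "$CONFIG"
python3 - "$CONFIG" <<'EOF'
import json, sys

path = sys.argv[1]
with open(path) as f:
    cfg = json.load(f)                # keep every existing setting

cfg["disable_ollama_cloud"] = False   # the only key we change

with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
print("updated", path)
EOF
```

This works on macOS and Linux as-is; on Windows, run the same Python body against %USERPROFILE%\.ollama\server.json.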
Step 4: Restart Ollama
After sign-in and config changes, you need to quit and reopen Ollama so it picks up everything:
- Quit Ollama completely:
  - Windows: Right-click the Ollama icon in the system tray → Quit.
  - macOS: Click the Ollama icon in the menu bar → Quit Ollama.
  - Linux: pkill ollama in Terminal, or quit from the app. If using systemd: sudo systemctl restart ollama.
- Start Ollama again:
  - Windows: From the Start menu, or run ollama serve in a new terminal.
  - macOS: Open from Applications or Spotlight (Cmd + Space → "Ollama").
  - Linux: Run ollama serve, or start from your app menu.
Ollama is now running with access to your cloud models.
Step 5: Use cloud models in the Cognito extension
- Open the Cognito extension in Chrome.
- Go to Ollama / provider settings and choose Ollama.
- Use Refresh or Check connection. You should see both local and cloud models (e.g. gpt-oss:20b-cloud, gpt-oss:120b-cloud).
- Start a chat and select a cloud model.
Cloud models are used the same way as local ones; the extension talks to your local Ollama, and Ollama routes cloud model requests to its servers using your account.
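You can exercise the same path outside the extension with a one-off request to the local API. A sketch, assuming the default port 11434; the model name is an example:

```shell
# One-off generation request to a cloud model through the local Ollama API.
# Assumes the default port 11434; gpt-oss:20b-cloud is an example model name.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "gpt-oss:20b-cloud", "prompt": "Say hello.", "stream": false}' \
  2>/dev/null || echo "Request failed - is Ollama running?"
```

If this returns a response, the extension will work with the same model.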
Common errors
403 Forbidden — "Cloud model disabled"
If you see a 403 error when using a cloud model, it usually means one of:
- Ollama Cloud is disabled in server.json → Set "disable_ollama_cloud": false (see Step 3 above).
- You're not signed in → Run ollama signin again (see Step 2).
- Ollama wasn't restarted after changing the config or signing in → Quit and restart Ollama (see Step 4).
403 Forbidden — CORS error (not a cloud issue)
If you see a 403 error with a local model (not a cloud model), the issue is likely CORS: Ollama is blocking the extension from connecting. This is a different problem; see the platform setup guides (Windows, macOS, Linux).
Cloud models don't appear in the model list
- Make sure you ran ollama signin and completed the browser flow.
- Quit Ollama fully and start it again.
- Check ollama.com → your account → devices, and verify this device is linked.
"Not signed in" or authentication errors
- Run ollama signin again and complete the browser steps.
- Make sure you're signed in at ollama.com in your browser before running ollama signin.
- Quit and restart Ollama after signing in.
Extension can't connect to Ollama at all
You still need OLLAMA_ORIGINS set for the Cognito extension to talk to your local Ollama. Cloud sign-in does not replace that. See Windows, macOS, Linux.
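For reference, this is what that looks like for a single shell session; the broad chrome-extension://* origin is an example — use the value from your platform guide, which also covers making it persistent (launchd/systemd/setx):

```shell
# Allow Chrome-extension origins for the current shell session only.
# Persistent setup is covered in the platform guides.
export OLLAMA_ORIGINS="chrome-extension://*"
echo "OLLAMA_ORIGINS=$OLLAMA_ORIGINS"
```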
Quick checklist
If cloud models aren't working, verify each of these:
- [ ] Signed in at ollama.com in your browser
- [ ] Ran ollama signin in Terminal and completed the device link
- [ ] server.json has "disable_ollama_cloud": false (or the key is absent; absent means enabled by default)
- [ ] OLLAMA_ORIGINS is set (for the extension to connect)
- [ ] Ollama was restarted after all changes
- [ ] Extension shows cloud models in the model dropdown
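Most of the local items above can be checked with a short script. A sketch: it can't verify your ollama.com sign-in, only the state on this machine, and it assumes the default port 11434.

```shell
# Local-only checks for the checklist above.
CONFIG="$HOME/.ollama/server.json"
if [ -f "$CONFIG" ] && grep -q '"disable_ollama_cloud"[[:space:]]*:[[:space:]]*true' "$CONFIG"; then
  echo "FAIL: server.json disables cloud access"
else
  echo "OK:   server.json does not disable cloud access"
fi
if [ -n "$OLLAMA_ORIGINS" ]; then
  echo "OK:   OLLAMA_ORIGINS is set in this shell"
else
  echo "NOTE: OLLAMA_ORIGINS is not set in this shell (it may be set system-wide)"
fi
if curl -sf http://localhost:11434/api/version >/dev/null 2>&1; then
  echo "OK:   Ollama is reachable on localhost:11434"
else
  echo "FAIL: Ollama is not reachable on localhost:11434"
fi
```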
💡 Don't need cloud models? Switch to a local model instead — llama3.2:3b, qwen2.5:3b, or phi3:mini are fast and run entirely on your machine.
For local-only setup, see the Overview and the platform guides.