Connect a Cloud Provider
This tutorial walks you through connecting a cloud AI provider. By the end, you’ll have switched from the default local model to a more powerful cloud backend.
Daneel AI works out of the box with WebGPU (local inference). But if you want higher-quality responses, you can connect another provider. This tutorial covers the four main options.
Option A: Claude (Anthropic API)
Claude is Anthropic’s flagship model family. It offers the highest quality responses and supports native tool calling with MCP servers.
- Open Daneel’s settings (gear icon on the launcher).
- Navigate to Claude in the sidebar.
- Paste your Anthropic API key. The key is encrypted with AES-256-GCM and stored locally — it never leaves your browser unencrypted.
- Select a model:
- Claude Opus 4.7 — most capable, hybrid reasoning for coding and vision
- Claude Opus 4.6 — previous flagship, same pricing as 4.7
- Claude Sonnet 4.6 — balanced quality and speed
- Claude Haiku 4.5 — fastest, lowest cost
- Close settings. In the chat panel, switch the provider dropdown to Claude.
You’re now chatting with Claude. You’ll see a cost annotation next to each response showing token usage.
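Under the hood, Claude access goes through Anthropic’s Messages API. The sketch below constructs (without sending) such a request using only the Python standard library; the model id `claude-sonnet-4-6` and the key placeholder are illustrative assumptions, not values taken from Daneel’s source.

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_claude_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) an Anthropic Messages API request."""
    body = json.dumps({
        "model": model,            # assumed id matching "Claude Sonnet 4.6"
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "x-api-key": api_key,              # your Anthropic API key
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

req = build_claude_request("sk-ant-...", "claude-sonnet-4-6", "Hello!")
print(req.full_url)  # https://api.anthropic.com/v1/messages
# Sending it with urllib.request.urlopen(req) returns JSON whose "usage"
# field reports input/output token counts — the numbers behind the
# cost annotation Daneel shows next to each response.
```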
Option B: Ollama (local server)
Ollama runs open-source models on your machine. Responses stay on your local network — nothing reaches the internet.
- Install Ollama on your computer.
- Pull a model: `ollama pull llama3.2` (or any model you prefer).
- In Daneel’s settings, navigate to Ollama.
- Set the base URL (default: `http://localhost:11434`). Daneel auto-probes the connection.
- Select a model from the dropdown — Daneel lists all models installed on your Ollama server.
- Close settings and switch the provider dropdown to Ollama.
Ollama supports tool calling with MCP servers, model management (pull, delete), and think-block streaming.
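You can check the same connection Daneel auto-probes by querying Ollama’s REST API yourself. A minimal Python sketch, assuming a default install on `localhost:11434` (`/api/tags` is Ollama’s standard model-listing endpoint):

```python
import json
import urllib.error
import urllib.request

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of models installed on an Ollama server.

    Returns [] when the server is unreachable — roughly the condition
    Daneel's auto-probe is presumably checking for.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []

print(list_ollama_models())  # e.g. ['llama3.2:latest'] if Ollama is running
```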
Option C: Azure OpenAI (enterprise)
For enterprise environments with Azure OpenAI Service deployments.
- In Daneel’s settings, navigate to Azure OpenAI.
- Enter your Azure endpoint URL and deployment name.
- Choose an authentication method:
- API Key — paste your Azure API key
- Entra ID (OAuth2) — authenticate via Microsoft identity
- Select your deployed model.
- Close settings and switch the provider to Azure OpenAI.
See How to Set Up Azure OpenAI for the detailed guide.
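Azure OpenAI routes requests by endpoint and deployment name rather than a global model id, which is why Daneel asks for both. A minimal sketch of the resulting URL shape, assuming a hypothetical resource name and a recent `api-version`:

```python
def azure_chat_url(endpoint: str, deployment: str,
                   api_version: str = "2024-06-01") -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment.

    The deployment name (chosen when you deploy a model in Azure)
    replaces the model id used by other providers.
    """
    return (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

# "myresource" and "gpt-4o" are placeholders for your own resource/deployment.
print(azure_chat_url("https://myresource.openai.azure.com", "gpt-4o"))
# https://myresource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-06-01
```

Requests to this URL authenticate with either an `api-key` header or an Entra ID bearer token, matching the two methods in step 3 above.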
Option D: Gemini Nano (Chrome built-in)
Gemini Nano is a small model built into Chrome. No downloads, no API keys.
- Make sure you’re on Chrome 120+ with the Gemini Nano flag enabled.
- In Daneel’s settings, navigate to Gemini Nano.
- Daneel detects availability automatically. If available, select a language.
- Switch the provider dropdown to Gemini Nano.
Gemini Nano runs on-device with no internet required, but it’s a small model — expect lower quality than Claude, or than Ollama running a larger model.
Comparing providers
For a deeper comparison of trade-offs between local and cloud providers, see The Provider Spectrum.
Next steps
- Connect an MCP server to give your AI access to external tools
- Create a custom agent with a specialized prompt
- Read about the privacy model to understand the data flow for each provider