
Daneel AI is designed with a privacy gradient — you choose how much (or how little) of your data leaves your machine. This page explains the data flow for each provider and feature.

## The privacy gradient

Every AI provider in Daneel has a **data residency** classification:

| Level | Meaning | Providers |
|-------|---------|-----------|
| **On-device** | Data never leaves your browser process | WebGPU, Gemini Nano |
| **Local network** | Data goes to a server on your LAN, never to the internet | Ollama |
| **Your cloud** | Data goes to infrastructure you control | Azure OpenAI |
| **Third-party cloud** | Data goes to an external API provider | Claude (Anthropic) |

You can filter models by privacy level in **Settings > AI Models** to find models that match your requirements.

## What stays local — always

Regardless of which LLM provider you use, these operations never leave your browser:

- **Embedding** — All vector embeddings are generated locally by the BGE Small model running on WebGPU (or WASM fallback). Your text is chunked and embedded on-device.
- **Vector search** — Cosine-similarity search runs over vectors stored in IndexedDB or held in GPU-accelerated memory. Search queries never leave the browser.
- **Document storage** — Vault documents, site indexes, and knowledge graphs are stored in IndexedDB in your browser profile.
- **Settings and credentials** — All configuration data, including encrypted API keys, stays in Chrome's local storage.
- **Content extraction** — Page text extraction (Readability.js, Turndown) runs in the content script or service worker.
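
The local vector search above reduces to a cosine-similarity scan over stored embeddings. A minimal sketch of the idea (function and field names are illustrative, not Daneel's actual API):

```typescript
// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank stored chunks against a query embedding, entirely in memory.
function topK(
  query: number[],
  chunks: { id: string; vector: number[] }[],
  k: number,
): { id: string; score: number }[] {
  return chunks
    .map((c) => ({ id: c.id, score: cosine(query, c.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

Because both the query embedding and the stored vectors live in the browser, nothing about your search ever crosses the network.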

## What leaves your machine — by choice

When you select a cloud LLM provider, the following data is exchanged with that provider's API:

- The assembled prompt (page content or RAG context, plus your question and conversation history) is sent to the provider
- The model's response streams back to your browser over the same HTTPS connection

This is the standard flow for any AI chat application. The difference is that with Daneel, you can avoid it entirely by using WebGPU or Ollama.
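
Conceptually, the assembled prompt is just a concatenation of those pieces. A hypothetical sketch — the structure and field names here are illustrative, not Daneel's internal format:

```typescript
interface Turn {
  role: "user" | "assistant";
  content: string;
}

// Assemble the single payload a cloud provider would receive:
// retrieved context, prior turns, and the new question.
function assemblePrompt(
  context: string,
  history: Turn[],
  question: string,
): { system: string; messages: Turn[] } {
  return {
    // Page content or RAG context goes into the system prompt...
    system: `Answer using the following page context:\n\n${context}`,
    // ...followed by the conversation history and the new question.
    messages: [...history, { role: "user", content: question }],
  };
}
```

Everything in this object is what the provider sees — no more, no less.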

### Claude (Anthropic)

- Data sent to Anthropic's API servers
- API key is encrypted with AES-256-GCM before storage; transmitted via HTTPS
- Anthropic's [data usage policy](https://www.anthropic.com/policies) applies
- The `anthropic-dangerous-direct-browser-access: true` header is set (required for browser-based API calls)
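
The header names below match Anthropic's public Messages API; a sketch of the browser-side request headers (the key is assumed to be already decrypted from Chrome storage):

```typescript
// Build headers for a direct browser-to-Anthropic API call.
function claudeHeaders(apiKey: string): Record<string, string> {
  return {
    "content-type": "application/json",
    "x-api-key": apiKey,
    "anthropic-version": "2023-06-01",
    // Required by Anthropic for browser-originated requests.
    "anthropic-dangerous-direct-browser-access": "true",
  };
}
```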

### Ollama

- Data sent to your Ollama server (default: `localhost:11434`)
- Stays on your local network — nothing reaches the internet
- You control the server and its data retention
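
The same chat request aimed at Ollama never resolves past your LAN. A sketch using Ollama's documented `/api/chat` endpoint (the model name is illustrative):

```typescript
// Build a request against a local Ollama server.
// Default host is localhost:11434, so nothing leaves the machine.
function ollamaRequest(
  host: string,
  model: string,
  messages: { role: string; content: string }[],
): { url: string; body: string } {
  return {
    url: `http://${host}/api/chat`,
    body: JSON.stringify({ model, messages, stream: true }),
  };
}
```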

### Azure OpenAI

- Data sent to your Azure OpenAI deployment in your tenant
- Your Azure data residency and compliance policies apply
- Authentication via API key or Microsoft Entra ID (formerly Azure AD)

### MCP tool calls

When using MCP servers, tool call parameters and results are exchanged with the remote server. Each MCP server has its own data handling policy. OAuth-connected servers (Stripe, Notion, etc.) operate under their respective privacy policies.

## Environment context

When enabled, Daneel injects your approximate location (city level) and current datetime into agent system prompts. This data is:

- **Location** — resolved once per session via browser geolocation + OpenStreetMap reverse geocoding. Stored only in memory (never persisted to disk). Sent to your LLM provider as part of the prompt.
- **Datetime** — computed locally from `Date` and `Intl.DateTimeFormat`. No network calls.
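
The datetime half is trivially local. A sketch of how it can be computed with the standard `Intl.DateTimeFormat` API (the exact output string Daneel injects may differ):

```typescript
// Build the datetime context string entirely on-device.
// No network call is involved at any point.
function datetimeContext(locale: string, now: Date = new Date()): string {
  const fmt = new Intl.DateTimeFormat(locale, {
    dateStyle: "full",
    timeStyle: "long",
  });
  return `Current date and time: ${fmt.format(now)}`;
}
```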

Both are gated by toggles in **Settings > Privacy**: location injection is off by default, datetime injection is on by default. The telemetry geolocation system (below) is completely separate and does not share data with context injection.

See [Environment Context](/concepts/context-injection/) for the full architecture.

## Telemetry

Daneel includes optional analytics (GA4 Measurement Protocol). When enabled:

**Collected:** feature usage counters (chat, search, crawl, model load), provider and model name, OS, Chrome version, language, country/region.

**Never collected:** page content, URLs you visit, chat messages, documents, API keys, or any personally identifiable information.

Telemetry can be toggled in **Settings > Privacy**. Disabling it stops all analytics collection immediately.
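
A telemetry event under the GA4 Measurement Protocol carries only the counters and metadata listed above, never content. A sketch of the payload shape (event and parameter names are illustrative):

```typescript
// Shape of a GA4 Measurement Protocol payload: a client ID
// plus a list of named events with scalar parameters only.
function telemetryEvent(
  clientId: string,
  feature: string,
  params: Record<string, string>,
): { client_id: string; events: { name: string; params: Record<string, string> }[] } {
  return {
    client_id: clientId, // random ID, not tied to your identity
    events: [{ name: feature, params }],
  };
}
```

There is simply no field in this payload where page content, URLs, or chat text could travel.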

## Encryption

- Claude API keys: AES-256-GCM encryption at rest in Chrome storage
- MCP OAuth tokens: stored in Chrome's local storage with auto-migration from legacy formats
- S3 credentials: stored in Chrome storage, excluded from data exports
- Azure SAS URLs: stored in Chrome storage, excluded from data exports

## Practical guidance

- **Maximum privacy:** Use WebGPU for LLM + default local embedding. Zero data leaves your machine.
- **Privacy with power:** Use Ollama on localhost. Data stays on your machine but you get access to larger models.
- **Enterprise compliance:** Use Azure OpenAI. Data stays in your Azure tenant under your compliance umbrella.
- **Best quality:** Use Claude. Prompts are sent to Anthropic's API, but embedding and search remain local.

To see this in action, follow [Your First Page Chat](/guides/first-page-chat/) with the WebGPU provider — everything runs locally.
