Local-First Privacy

Familiar is built with a local-first architecture. This means your data stays where it belongs: on your Mac.

Zero Telemetry: We do not collect any usage data, analytics, or telemetry. We have no visibility into how you use the app, what you ask your agents, or what they do.
Local Storage: All prompt history, agent logs, workspaces, and configurations are stored locally on your machine. Nothing is ever uploaded to our servers.
Secure Credentials: Your LLM provider API keys are stored securely in the macOS Keychain. Familiar accesses them only to perform the tasks you request.
Direct Inference: When you use a cloud provider such as OpenAI or Anthropic, Familiar communicates directly with their API from your machine. Your prompts and data are never proxied through our infrastructure.
Manual Bug Reporting: The app does not automatically report errors or crashes. Feedback and bug reports are purely voluntary and must be initiated by you.
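The Keychain and direct-inference points above can be sketched in code. This is a hypothetical illustration, not Familiar's actual source: the service and account names, the provider endpoint, and the model name are all assumptions for the example.

```swift
import Foundation
import Security

// Illustrative sketch: read an API key from the macOS Keychain.
// "Familiar" / "openai" are assumed identifiers, not real ones.
func loadAPIKey(service: String, account: String) -> String? {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        kSecAttrAccount as String: account,
        kSecReturnData as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne,
    ]
    var item: CFTypeRef?
    guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
          let data = item as? Data else { return nil }
    return String(data: data, encoding: .utf8)
}

// The request goes straight from the user's machine to the provider's
// public endpoint; no intermediate server sees the prompt or the key.
func complete(prompt: String) async throws -> Data {
    guard let key = loadAPIKey(service: "Familiar", account: "openai") else {
        throw URLError(.userAuthenticationRequired)
    }
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(key)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "gpt-4o-mini",
        "messages": [["role": "user", "content": prompt]],
    ])
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

Because the key is fetched from the Keychain at call time and the request is issued by URLSession on the local machine, nothing in this flow passes through a vendor-operated proxy.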

Third-Party Services

While Familiar is designed for privacy, the LLM providers you connect to have their own policies.

Provider Policies

By connecting a service like OpenAI, Anthropic, or Google Gemini, you are subject to their respective terms and privacy policies.

Local Models

For maximum privacy, you can use a local provider such as Ollama or LM Studio, so your prompts and data never leave your machine at all.
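For a sense of what "local" means in practice, here is a hedged sketch of the same request pattern aimed at an Ollama server on its default port (11434). The model name is an assumption; substitute any model you have pulled locally.

```swift
import Foundation

// Illustrative sketch: inference against a local Ollama server.
// All traffic stays on localhost, so nothing leaves the machine.
func completeLocally(prompt: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "http://localhost:11434/api/generate")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "llama3.2",   // assumed; any locally pulled model works
        "prompt": prompt,
        "stream": false,
    ])
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

No API key is needed here: the local server trusts requests from the same machine, and your prompts never cross the network boundary.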