Cloud AI and Your Data
Understand exactly what is sent to OpenAI, Claude, and Gemini when you use cloud AI mode in Elephas.
How cloud AI works in Elephas
When you use cloud AI mode, Elephas sends your question and relevant document excerpts to an AI provider (OpenAI, Claude, or Gemini) to generate a response. Only the text needed to answer your question is sent, not your entire document library.
What is sent to the AI provider
- Your question text
- Relevant text passages from your documents (typically a small number of short excerpts that matched your question via the local search index)
- System instructions that tell the AI how to respond
Your full documents, file names, file paths, and workspace metadata are not sent.
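As a rough sketch of what such a request might look like, here is an OpenAI-style chat payload assembled from a question and the matched excerpts. The model name, prompt wording, and function names are illustrative assumptions, not Elephas internals:

```python
# Illustrative sketch (not Elephas source): the shape of a cloud AI request.
# Only the question, the matched excerpts, and system instructions are
# included; file names, paths, and the rest of the library never leave the Mac.

def build_request(question: str, excerpts: list[str]) -> dict:
    """Assemble a hypothetical OpenAI-style chat payload from local search hits."""
    context = "\n---\n".join(excerpts)
    return {
        "model": "gpt-4o",  # provider model name; assumed for illustration
        "messages": [
            {"role": "system", "content": "Answer using only the provided excerpts."},
            {"role": "user", "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    }

payload = build_request(
    "What is our refund policy?",
    ["Refunds are issued within 14 days of purchase.",
     "Contact support to start a refund."],
)
```

Note that nothing in the payload identifies which document an excerpt came from, matching the point above.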
When Smart Redaction (introduced in v11.7) is enabled, personal information in your question and in those passages is anonymized on-device before anything is sent. See the Sensitive Data Protection article for details.
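A minimal sketch of on-device anonymization in the spirit of Smart Redaction; the regex patterns and placeholder tokens below are illustrative assumptions, not the actual Elephas implementation:

```python
import re

# Illustrative on-device redaction: personal identifiers are swapped for
# placeholders before the text leaves the Mac. The patterns here are simple
# examples, not the real Smart Redaction rules.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str) -> str:
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

print(redact("Reach Jane at jane@example.com or +1 555 010 4477."))
# → Reach Jane at [EMAIL] or [PHONE].
```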
Elephas reverse proxy
By default, cloud AI requests go through an Elephas reverse proxy. This proxy forwards your request to the AI provider and returns the response. The proxy does not log, store, or inspect your query content. It exists to simplify API key management for users who do not want to create their own API accounts.
Bring Your Own Key
If you prefer, you can connect your own API key directly. Requests go straight from your Mac to the AI provider with no intermediary. Configure this in Settings under the AI Provider tab.
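With your own key, the request is addressed directly to the provider's public endpoint. The sketch below uses OpenAI's documented Chat Completions URL and bearer-token header; the key is a placeholder and the request is constructed but not sent:

```python
import urllib.request

API_KEY = "sk-your-own-key"  # placeholder for your personal API key

# The request targets the provider's endpoint directly -- no intermediary.
req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",  # your key, never a shared proxy key
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.full_url)  # the only host contacted when using your own key
```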
Training and data retention
None of the supported AI providers (OpenAI, Anthropic, Google) use API data for model training. API requests are subject to each provider's data retention policies, which typically retain data for a short period (30 days or less) for abuse monitoring before deletion.
Minimizing data exposure
- Enable Smart Redaction (v11.7+) to anonymize personal information on-device before it is sent
- Use Offline AI for your most confidential documents
- Use Bring Your Own Key to eliminate the reverse proxy from the data path