Offline AI Mode
Run language models locally on your Mac, with no internet connection and complete privacy, powered by Apple Silicon.
Complete privacy with local models
Offline AI mode runs a language model directly on your Mac. Your documents, your questions, and the AI responses all stay on your device. No internet connection is required after the model is downloaded.
Requirements
- Apple Silicon Mac (M1, M2, M3, M4, or later)
- At least 8 GB of unified memory (16 GB recommended for larger models)
- 4 GB of free storage for the default model
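The requirements above can be checked programmatically before downloading a model. Below is a minimal sketch in Python; the function name and return shape are illustrative only and not part of Elephas, which performs its own checks:

```python
import os
import platform
import shutil

def check_offline_ai_requirements(min_memory_gb=8, min_storage_gb=4):
    """Report whether this machine meets the offline AI requirements.

    Hypothetical helper: thresholds mirror the requirements list above
    (8 GB unified memory, 4 GB free storage for the default model).
    """
    # Apple Silicon Macs report 'arm64' as the machine architecture.
    apple_silicon = platform.system() == "Darwin" and platform.machine() == "arm64"

    # Total physical memory via POSIX sysconf (available on macOS and Linux).
    try:
        mem_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9
    except (ValueError, OSError):
        mem_gb = 0.0

    # Free space on the volume holding the user's home directory.
    free_gb = shutil.disk_usage(os.path.expanduser("~")).free / 1e9

    return {
        "apple_silicon": apple_silicon,
        "memory_ok": mem_gb >= min_memory_gb,
        "storage_ok": free_gb >= min_storage_gb,
        "memory_gb": round(mem_gb, 1),
        "free_storage_gb": round(free_gb, 1),
    }

print(check_offline_ai_requirements())
```

Note that `memory_gb` reports total physical memory; on Apple Silicon this is the unified memory shared between CPU and GPU, which is why larger models benefit from 16 GB.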
Setting up offline AI
- Open Elephas Settings
- Go to the AI Provider tab
- Select Offline AI
- Choose a model size. Smaller models are faster; larger models give better answers
- Click Download. The model downloads once and is stored locally
When to use offline AI
- Working with confidential or sensitive documents
- No internet access available
- You want zero data transmission to external services
- Testing or evaluating Elephas without connecting to any API
Tradeoffs
Local models are smaller than cloud models and may produce less detailed answers for complex questions. For most document Q&A tasks, the quality is sufficient. You can switch between offline and cloud modes at any time to compare results.
Related articles
- How Your Data Is Stored (Privacy & Data Protection) — Your documents are indexed and stored entirely on your Mac. Learn exactly how Elephas keeps your data private and where it lives.
- How Elephas Compares on Privacy (Privacy & Data Protection) — A concrete comparison of the Elephas privacy model vs ChatGPT, Claude, and other generic AI assistants.
- AI Provider Configuration (Settings) — Step-by-step setup for OpenAI, Claude, Gemini, Ollama, and other AI providers in Elephas. Switch between Offline, Cloud, and Bring Your Own Key modes.