AI Provider Configuration
Step-by-step setup for OpenAI, Claude, Gemini, Ollama, and other AI providers in Elephas. Switch between Offline, Cloud, and Bring Your Own Key modes.
Choosing your AI provider
Elephas supports three modes for AI processing. You can switch between them at any time.
Offline AI
Runs a language model locally on your Mac (Apple Silicon required). Best for maximum privacy; no internet connection is needed once the model is downloaded.
Cloud AI (Elephas proxy)
Uses OpenAI or Claude through the Elephas reverse proxy. You do not need your own API key. Your data is not used for training.
Bring Your Own Key
Connect your own API key from OpenAI, Anthropic (Claude), or Google (Gemini). Requests go directly from your Mac to the provider.
How to change providers
- Open Elephas Settings
- Go to the AI Provider tab
- Select your preferred mode
- If using Bring Your Own Key, paste your API key
- Click Save
Setting up OpenAI
- Go to platform.openai.com/api-keys
- Create a new API key
- Copy the key and paste it in Elephas Settings under AI Provider > OpenAI
- Select your preferred model (GPT-4o recommended for best quality)
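Before pasting the key into Elephas, you can sanity-check it from the command line. A minimal sketch using only Python's standard library; the endpoint and `Authorization` header are OpenAI's documented model-listing API, while the `OPENAI_API_KEY` environment variable name is just a common convention:

```python
import os
import urllib.request

def openai_models_request(api_key: str) -> urllib.request.Request:
    # GET /v1/models lists the models your key can access;
    # a valid key returns HTTP 200, an invalid one HTTP 401.
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

if __name__ == "__main__":
    req = openai_models_request(os.environ["OPENAI_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print("Key accepted:", resp.status == 200)
```

If this prints `Key accepted: True`, the same key should work in Elephas.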
Setting up Claude (Anthropic)
- Go to console.anthropic.com
- Create an API key
- Paste it in Elephas Settings under AI Provider > Claude
- Select your preferred model (Claude Sonnet recommended for everyday use)
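You can verify an Anthropic key the same way. Anthropic authenticates with an `x-api-key` header plus a required `anthropic-version` header; the model-listing endpoint used here is part of Anthropic's public API, and `ANTHROPIC_API_KEY` is just a conventional variable name:

```python
import os
import urllib.request

def anthropic_models_request(api_key: str) -> urllib.request.Request:
    # Anthropic uses x-api-key (not a Bearer token) and requires
    # an anthropic-version header on every request.
    return urllib.request.Request(
        "https://api.anthropic.com/v1/models",
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
        },
    )

if __name__ == "__main__":
    req = anthropic_models_request(os.environ["ANTHROPIC_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print("Key accepted:", resp.status == 200)
```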
Setting up Gemini (Google)
- Go to aistudio.google.com
- Create an API key
- Paste it in Elephas Settings under AI Provider > Gemini
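A Gemini key from AI Studio can be checked the same way. Unlike OpenAI and Anthropic, the Gemini API accepts the key as a URL query parameter; `GEMINI_API_KEY` is again just a conventional name:

```python
import os
import urllib.request

def gemini_models_url(api_key: str) -> str:
    # Listing available models is a cheap way to confirm the key works;
    # the key travels as a ?key= query parameter.
    return ("https://generativelanguage.googleapis.com/v1beta/models"
            f"?key={api_key}")

if __name__ == "__main__":
    with urllib.request.urlopen(gemini_models_url(os.environ["GEMINI_API_KEY"])) as resp:
        print("Key accepted:", resp.status == 200)
```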
Setting up Ollama (local)
Ollama lets you run open-source models locally.
- Install Ollama from ollama.com
- Pull a model (e.g., ollama pull llama3)
- In Elephas Settings, select Ollama; installed models are detected automatically
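To see which models Elephas can pick up, you can query Ollama's local HTTP API yourself. The `/api/tags` endpoint listing pulled models is Ollama's documented API (that Elephas reads this same inventory is an assumption); Ollama listens on port 11434 by default:

```python
import json
import urllib.request

def model_names(tags_payload: dict) -> list:
    # Pull the model names out of Ollama's /api/tags response,
    # e.g. {"models": [{"name": "llama3:latest", ...}]}.
    return [m["name"] for m in tags_payload.get("models", [])]

if __name__ == "__main__":
    # Ollama's local API listens on http://localhost:11434 by default.
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        print(model_names(json.load(resp)))
```

If your pulled model (e.g. `llama3:latest`) appears in the printed list, Elephas should detect it too.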
Other supported providers
Elephas also supports Groq, Perplexity, Together AI, Fireworks AI, OpenRouter, DeepSeek, Grok, and Azure OpenAI. Each follows the same pattern: create an API key on the provider's website and paste it in the corresponding section of Elephas Settings.
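Several of these providers (for example Groq and OpenRouter) expose OpenAI-compatible APIs, so a single hedged helper can sanity-check keys for any of them; the base URL varies per provider (check each provider's docs), and Azure OpenAI uses a different URL scheme and header, so it is not covered by this sketch:

```python
import urllib.request

def models_request(base_url: str, api_key: str) -> urllib.request.Request:
    # Generic key check for OpenAI-compatible providers: GET
    # {base_url}/models with a Bearer token, as in the OpenAI API.
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
```

Usage: pass the provider's documented base URL, e.g. `models_request("https://api.groq.com/openai/v1", key)`.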