A native Apple client for LiteLLM, Ollama, and any
OpenAI-compatible server.
Your data, your server, your rules.
FEATURES
Connect directly to Ollama, LiteLLM, or any OpenAI-compatible server – Anthropic, Groq, Gemini, and more.
Self-hosted – your data stays on your server. API keys stored securely in Keychain.
Chat with any LLM using Server-Sent Events for instant, token-by-token responses.
Search the web and get up-to-date answers – the model browses in real time to enrich its responses.
Generate images from text prompts via DALL·E, Stable Diffusion, Gemini, and more.
Speech-to-text dictation and text-to-speech playback for hands-free interaction.
Attach images and PDFs to any conversation and let the model analyze, summarize, or answer questions about them.
Autonomous tool-calling loop – the model searches the web, saves memories, and chains actions to complete complex tasks.
Memory that persists across conversations – add your own notes, or let the model build its memory automatically as you chat.
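The token-by-token streaming above follows the OpenAI-compatible convention: the server emits Server-Sent Events whose `data:` payloads each carry a small JSON delta, terminated by `[DONE]`. As a minimal sketch (the stream body below is illustrative, not captured from a real server), parsing one looks like this:

```python
import json

def parse_sse_tokens(raw: str) -> list[str]:
    """Extract content tokens from an OpenAI-compatible SSE stream body."""
    tokens = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # SSE comments and blank keep-alive lines are skipped
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # sentinel marking the end of the stream
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            tokens.append(delta["content"])
    return tokens

# Abridged example of a /v1/chat/completions response with stream enabled
stream = (
    'data: {"choices":[{"delta":{"role":"assistant"}}]}\n'
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    'data: [DONE]\n'
)
print("".join(parse_sse_tokens(stream)))  # prints "Hello"
```

Because each delta arrives as its own event, the client can render tokens the moment they land rather than waiting for the full completion.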
GET STARTED
iOS 26+ · iPadOS 26+ · macOS 26+
OPEN SOURCE
OpenClient is free and open source under the GNU Affero General Public License v3.0. Browse the code, report issues, or contribute on GitHub.
View on GitHub