OpenClient

A native Apple client for LiteLLM, Ollama, and any OpenAI-compatible server. Connect to any LLM provider with a beautiful SwiftUI experience. Run local models or use privacy-respecting APIs: your data, your server, your rules.

View on GitHub · App Store Coming Soon

Real-time Streaming

Chat with any LLM using Server-Sent Events for instant, token-by-token responses.
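As a rough sketch of what token-by-token streaming looks like on the wire (assuming an OpenAI-compatible `/v1/chat/completions` endpoint called with `"stream": true`; OpenClient's own Swift implementation may differ), each SSE `data:` line carries one delta chunk:

```python
import json

def parse_sse_chunks(lines):
    """Extract content tokens from OpenAI-style SSE 'data:' lines."""
    tokens = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alives and comment lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            tokens.append(delta["content"])
    return tokens

# Sample of what an OpenAI-compatible server emits over SSE:
stream = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(parse_sse_chunks(stream)))  # prints "Hello"
```

Rendering each token as it arrives, rather than waiting for the full response, is what makes the chat feel instant.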

Liquid Glass Design

Built with Apple's Liquid Glass design language for a native, modern feel.

Any LLM Provider

Connect directly to Ollama, LiteLLM, or any OpenAI-compatible server: Anthropic, Groq, Gemini, and more.

Multi-platform

Native experience on iPhone, iPad, and Mac from a single shared codebase.

Privacy First

Self-hosted: your data stays on your server. API keys are stored securely in the Keychain.

Image Generation

Generate images from text prompts via DALL·E, Stable Diffusion, Gemini, and more.

Voice & Audio

Speech-to-text dictation and text-to-speech playback for hands-free interaction.

Multilingual

Available in English, Spanish, French, German, Italian, Portuguese, Japanese, and more.

Getting Started

  1. Deploy a LiteLLM proxy, or run Ollama locally
  2. Download OpenClient, open Settings, and enter your server URL

Xcode 26+ · iOS 26+ / macOS 26+
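A quick way to sanity-check the server URL from step 2 before entering it in the app is to query the model-listing endpoint, which LiteLLM and other OpenAI-compatible servers (including Ollama's OpenAI compatibility layer) expose at `/v1/models`. A minimal sketch, using a hypothetical local Ollama default:

```python
import json
from urllib.parse import urljoin
from urllib.request import Request, urlopen

def models_url(base_url):
    """Build the model-listing endpoint from the server URL entered in Settings."""
    return urljoin(base_url.rstrip("/") + "/", "v1/models")

def list_models(base_url, api_key=None):
    """Fetch available model IDs from an OpenAI-compatible server."""
    req = Request(models_url(base_url))
    if api_key:
        # LiteLLM proxies typically expect a bearer token; local Ollama does not.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

# Hypothetical local Ollama default port:
print(models_url("http://localhost:11434"))  # http://localhost:11434/v1/models
```

If this endpoint answers with a JSON list of models, the same base URL should work in OpenClient's Settings.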

Open Source

OpenClient is open source. Browse the code, report issues, or contribute on GitHub.