OpenClient
iOS · iPadOS · macOS

A native Apple client for LiteLLM, Ollama, and any OpenAI-compatible server.
Your data, your server, your rules.

Everything you need to talk to any LLM

πŸ€–

Any LLM Provider

Connect directly to Ollama, LiteLLM, or any OpenAI-compatible server: Anthropic, Groq, Gemini, and more.

πŸ”’

Privacy First

Self-hosted: your data stays on your server. API keys are stored securely in the Keychain.

πŸ’¬

Real-time Streaming

Chat with any LLM using Server-Sent Events for instant, token-by-token responses.
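For the curious, a streamed chat completion against an OpenAI-compatible endpoint looks like this. This is a sketch assuming a local Ollama server on its default port; the URL and model name are placeholders, not requirements:

```shell
# Request a streamed chat completion from a local OpenAI-compatible server.
# Assumes Ollama is running on its default port; "llama3.2" is a placeholder model.
curl -N http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
# With "stream": true the server replies as Server-Sent Events:
# one "data: {...}" chunk per token, terminated by "data: [DONE]".
```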

πŸ”

Web Browsing

Search the web and get up-to-date answers: the model browses in real time to enrich its responses.

πŸ–ΌοΈ

Image Generation

Generate images from text prompts via DALL·E, Stable Diffusion, Gemini, and more.

πŸŽ™οΈ

Voice & Audio

Speech-to-text dictation and text-to-speech playback for hands-free interaction.

πŸ“Ž

Vision & Documents

Attach images and PDFs to any conversation and let the model analyze, summarize, or answer questions about them.

πŸ€–

Agent Mode

Autonomous tool-calling loop: the model searches the web, saves memories, and chains actions to complete complex tasks.

🧠

Persistent Memory

Memory that persists across conversations: add your own notes or let the model build it automatically during chat.

Up and running in two steps

  1. Deploy a LiteLLM proxy, or run Ollama locally
  2. Download OpenClient, open Settings, and enter your server URL
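The server side of the two steps above might look like this on the command line. This is a sketch using default ports and a placeholder model name; adapt it to your own setup:

```shell
# Option A: run Ollama locally
# (it serves an OpenAI-compatible API on port 11434 by default)
ollama serve &
ollama pull llama3.2

# Option B: run a LiteLLM proxy
# (it serves on port 4000 by default)
pip install 'litellm[proxy]'
litellm --model ollama/llama3.2

# Then open OpenClient's Settings and enter the server URL, e.g.:
#   http://localhost:11434/v1   (Ollama)
#   http://localhost:4000       (LiteLLM)
```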

iOS 26+ · iPadOS 26+ · macOS 26+

Built in the open, auditable by anyone

OpenClient is free and open source under the GNU Affero General Public License v3.0. Browse the code, report issues, or contribute on GitHub.

View on GitHub