Getting Started
Lokei is a private AI assistant that lives in your Mac's menu bar. On supported Macs it can use Apple Intelligence for on-device responses, and it also supports local Ollama models plus optional OpenAI-compatible providers you configure.
You need a Mac running macOS 13 Ventura or later. Apple Intelligence support depends on your Mac model, macOS version, and regional availability. If you choose the Ollama backend, you also need Ollama installed and running.
Lokei works on any Mac running macOS 13 Ventura or later, including both Intel and Apple Silicon Macs. Apple Intelligence is only offered on Macs that support it; other Macs keep the Ollama-first setup flow.
Lokei is Mac-first today, and an iOS companion is in development. The iPhone app is being built around the same privacy posture: Apple Intelligence on eligible devices, local-network AI when you use your Mac as the server, and no Lokei cloud in the middle. It is not on the App Store yet.
When Lokei launches on the Mac App Store, install it and follow onboarding. Supported Macs can start with Apple Intelligence. If you choose Ollama, Lokei shows the Ollama download link and model setup steps inline, then lets you manage models from Settings → AI & Models.
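Outside of Lokei's built-in flow, the equivalent Terminal steps look roughly like this (the model tag `llama3.2:3b` is just an example; pick whatever the Model Library recommends for your Mac):

```shell
# Download a small model and give it a quick smoke test.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2:3b
  ollama run llama3.2:3b "Say hello in one word."
else
  echo "Ollama is not installed; download it from https://ollama.com"
fi
```

Lokei's Settings → AI & Models does the same thing with one click, so Terminal is optional.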
Privacy & Data
Lokei is private by architecture. With Apple Intelligence or local Ollama, processing stays on your Mac. If you configure a remote provider, prompts sent to that provider leave your Mac and are governed by that provider's terms, but Lokei still does not collect them.
No. Lokei collects zero data: no conversations, no usage stats, no crash reports sent to us, no analytics, nothing. Your chat history is stored locally by default. If you enable iCloud sync, it syncs through your Apple iCloud account, not through Lokei servers.
No Lokei account is required. App Store distribution will be handled by Apple when Lokei launches. A separate provider account is only relevant if you choose to configure a remote provider yourself.
Yes, with local or on-device backends. Apple Intelligence availability is handled by macOS, and Ollama works offline after Ollama and your models are installed. Remote providers require internet access.
AI Models
If you use Apple Intelligence, macOS manages the model. If you use Ollama, it depends on your Mac's RAM: smaller 1–3B-parameter models are best for 8GB Macs, while 16GB+ Macs can usually run larger 7B models. Lokei's Model Library shows RAM requirements for every Ollama model.
No. Apple Intelligence needs no model download inside Lokei. For Ollama, Lokei includes a built-in Model Library where you can browse and download models with a single click. Models are rated by RAM requirement so you know what should run well on your Mac.
Small models (1-3B parameters) are typically 1–2 GB. Medium models (7B) are around 4–5 GB. Lokei's Model Library shows the exact download size for each model before you download it. Models are stored in Ollama's directory, not inside the Lokei app itself.
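As a back-of-the-envelope check (a rule of thumb, not Lokei's own sizing logic): a 4-bit-quantized model takes roughly half a byte per parameter, plus some overhead for metadata and context:

```shell
# Rough size of a 4-bit quantized model, in GB:
# bytes ≈ parameters × 0.5, padded ~20% for metadata and context.
estimate_gb() {
  awk -v p="$1" 'BEGIN { printf "%.1f\n", p * 0.5 * 1.2 }'
}

estimate_gb 3   # 3B model: ~1.8 GB
estimate_gb 7   # 7B model: ~4.2 GB
```

These estimates line up with the ranges above; the Model Library's listed sizes are the authoritative numbers.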
Yes. Lokei automatically detects all models installed through Ollama and shows them in the model picker. If you've been using Ollama before Lokei, all your existing models are immediately available.
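You can see the same set from Terminal: `ollama list` prints every installed model along with its size (guarded here in case the CLI isn't installed):

```shell
# List locally installed Ollama models, with a fallback message.
if command -v ollama >/dev/null 2>&1; then
  ollama list
else
  echo "ollama CLI not found; install it from https://ollama.com"
fi
```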
Features & Usage
Click the Lokei leaf icon in your menu bar — it's always there, ready in one click. For longer sessions, expand Lokei into a full resizable window with a sidebar for managing multiple conversations.
Specialist Modes (called Presets in the app) are different system prompts that tune Lokei's personality and behavior for specific tasks. Built-in modes include General, Code Helper, Debug Mode, Explain It, Writing, and Brainstorm. You can also create unlimited custom modes in Settings → Presets.
Project Context lets you tell Lokei about your project once — your tech stack, coding style, architecture decisions — and Lokei quietly includes that knowledge with every message you send. No more repeating yourself every conversation. Set it up in Settings → Context.
Yes. You can drag and drop files onto Lokei's chat window, click the file icon in the input bar, attach images, or paste clipboard content. Lokei supports code files, text files, PDFs, images, and more. Large files are automatically truncated to keep responses fast.
Yes. Click the export icon in the header to save your current conversation as Markdown, HTML, or JSON. This works in both the menu bar popover and the expanded full window.
Availability & Pricing
The plan is a one-time Mac App Store purchase with no Lokei subscription. Final pricing will be shown on the App Store when Lokei launches.
Updates will be delivered through the Mac App Store after launch. Any final purchase and update terms will be shown in the App Store listing.
Lokei itself is planned without a subscription. Apple Intelligence and Ollama do not add per-message API fees. If you configure a remote provider, that provider may have its own pricing or usage limits.
After the Mac App Store launch, Apple's App Store policies determine use across Macs associated with your Apple ID, including any applicable Family Sharing support.
Troubleshooting
This only applies if you choose the Ollama backend. Open your Applications folder and launch the Ollama app, or run
ollama serve in Terminal. Once Ollama is running, click "Try Again" in Lokei. If you haven't installed Ollama yet, download it from ollama.com.
If responses are slow with Ollama, switch to a smaller model; 3B-parameter models are significantly faster than 7B models, especially on 8GB RAM Macs. Long conversations can also slow down over time, so use "Summarize & Compress" in the chat or start a new chat to speed things back up. Remote provider speed depends on the provider and your connection.
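To check from Terminal whether the Ollama server is up, you can probe its default local port (11434); this is a generic reachability check, not something Lokei requires:

```shell
# Ollama answers plain HTTP on localhost:11434 when it is running.
if curl -sf --max-time 2 http://localhost:11434 >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not running; launch the app or run: ollama serve"
fi
```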
Email us at hello@getlokei.com. We're a small team and we read every message.
Still have a question?
We're happy to help. Reach out and we'll get back to you quickly.
hello@getlokei.com