Iter on the Go
A React Native mobile app for chatting with your local AI models by text or voice, keeping you connected to your workspace from anywhere on your network.
What you can do
Chat with AI models
Chat with your local LLMs and watch responses stream in token by token over Server-Sent Events (SSE), in real time, from any model in your fleet.
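A minimal sketch of the client side of that streaming: SSE delivers events as `data:` lines separated by blank lines, so the app only needs a small parser to turn each buffered chunk into tokens. The field framing follows the SSE specification; how Iter's chat endpoint packs tokens into each event is an assumption here.

```typescript
// Extract the `data:` payload from each complete SSE event in a buffer.
// Events are separated by a blank line per the SSE specification.
export function parseSSE(buffer: string): string[] {
  const events: string[] = [];
  for (const block of buffer.split("\n\n")) {
    const data = block
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice("data:".length).trimStart())
      .join("\n"); // multiple data: lines in one event join with newlines
    if (data) events.push(data);
  }
  return events;
}

// parseSSE("data: Hel\n\ndata: lo\n\n") yields ["Hel", "lo"],
// which the chat view appends to the message as tokens arrive.
```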
Voice input
Tap to talk. Voice-to-text powered by the Whisper voice server with real-time interim transcriptions as you speak.
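One way interim transcriptions can work on the client, sketched under assumptions: the Whisper voice server is taken to push messages marked interim or final, and the app replaces the interim tail while committing only final text. The message shape below is hypothetical, not the server's actual protocol.

```typescript
// Hypothetical transcription message: `final` marks text that should be
// committed; interim text is shown but may be revised by the next message.
interface TranscriptMsg {
  text: string;
  final: boolean;
}

// Merge an incoming message into the committed transcript, returning both
// the committed text and what the input field should currently display.
function applyTranscript(
  committed: string,
  msg: TranscriptMsg
): { committed: string; display: string } {
  const merged = (committed + " " + msg.text).trim();
  if (msg.final) {
    return { committed: merged, display: merged };
  }
  // Interim: display it, but keep the committed transcript unchanged so the
  // next interim message can overwrite this tail.
  return { committed, display: merged };
}
```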
Browse hosts and models
See all Ollama hosts on your network, their available models, and connection status at a glance.
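Host discovery can lean on Ollama's standard API: `GET /api/tags` on port 11434 lists a host's installed models, and a failed request marks the host offline. The endpoint and response shape are Ollama's; the host list and status type below are app-level assumptions.

```typescript
interface HostStatus {
  host: string;
  online: boolean;
  models: string[];
}

// Map Ollama's /api/tags response body to a flat list of model names.
export function modelNames(tags: { models: { name: string }[] }): string[] {
  return tags.models.map((m) => m.name);
}

// Probe one host: reachable with a valid response means online.
async function checkHost(host: string): Promise<HostStatus> {
  try {
    const res = await fetch(`http://${host}:11434/api/tags`);
    if (!res.ok) return { host, online: false, models: [] };
    const body = (await res.json()) as { models: { name: string }[] };
    return { host, online: true, models: modelNames(body) };
  } catch {
    // Network error: treat the host as offline rather than failing the UI.
    return { host, online: false, models: [] };
  }
}
```

Running `checkHost` over every known host in parallel (e.g. `Promise.all`) gives the at-a-glance status list.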
Smart context
Responses include structured context metadata (intents, entities, and action suggestions), so the app knows when to create a feature request or look up a project.
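The metadata might be typed along these lines; all field and action names here are illustrative assumptions, not Iter's actual schema.

```typescript
// Hypothetical action names the app could dispatch on.
type SuggestedAction = "create_feature_request" | "lookup_project";

// Hypothetical shape of the context metadata attached to a response.
interface ResponseContext {
  intents: string[];                // e.g. ["feature_request"]
  entities: Record<string, string>; // e.g. { project: "iter" }
  actions: SuggestedAction[];
}

// The app can branch on suggested actions rather than re-parsing the reply.
function shouldCreateFeatureRequest(ctx: ResponseContext): boolean {
  return ctx.actions.includes("create_feature_request");
}
```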
Built with
React Native
Cross-platform iOS and Android from a single TypeScript codebase.
Whisper Voice Server
On-device streaming speech-to-text with voice activity detection (VAD) for natural voice interaction.
SSE Streaming
Real-time token-by-token responses via Server-Sent Events. No polling.