Iter

Direct LLM Chat

Chat with your local LLM models directly from the dashboard. Streaming responses, session history, and configurable model routing.

[Screenshot: LLM chat interface with streaming responses]

Chat capabilities

1. Local model access

Chat uses the 'liaison' capability routing to connect to your configured Ollama models.
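
For a sense of what a routed request looks like on the wire, here is a minimal TypeScript sketch of a non-streaming chat call against Ollama's default local endpoint. The endpoint and model name assume a stock Ollama install; the 'liaison' routing layer itself is internal to Iter and not shown.

```typescript
// Minimal sketch: one non-streaming chat request to a local Ollama server.
// Assumes Ollama's default endpoint (http://localhost:11434); in Iter, the
// 'liaison' layer decides which configured model receives the request.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatOnce(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // non-streaming replies carry one complete message
}

// Usage: await chatOnce("llama3.1", [{ role: "user", content: "Hello" }]);
```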

2. MCP tool integration

Connected MCP servers provide tools the LLM can call: file search, code search, project queries, and more.
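
As a rough illustration of the server side, the sketch below registers a single hypothetical file-search tool using the public MCP TypeScript SDK (@modelcontextprotocol/sdk). The tool name, schema, and handler body are illustrative only, not the actual tools Iter's connected servers expose.

```typescript
// Sketch: a hypothetical MCP server exposing one tool the LLM can call.
// The tool name, schema, and handler are stand-ins for illustration.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "file-tools", version: "0.1.0" });

server.tool(
  "search_files",
  { query: z.string().describe("Substring to look for in file names") },
  async ({ query }) => ({
    // MCP tool results are returned as content blocks.
    content: [{ type: "text" as const, text: `(results for "${query}" would go here)` }],
  })
);

await server.connect(new StdioServerTransport());
```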

3. Streaming responses

Responses stream in real time from local models. No cloud latency.
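
Ollama emits newline-delimited JSON chunks when streaming is enabled, so a client can render tokens as they arrive. The sketch below shows one way to consume that stream in TypeScript; it assumes Node 18+ (global fetch) and a stock Ollama endpoint.

```typescript
// Sketch: consuming Ollama's streaming chat endpoint. With stream: true,
// each newline-delimited JSON chunk carries a fragment of the assistant
// message until a chunk with done: true arrives.
async function streamChat(model: string, prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Each complete line in the buffer is one JSON chunk.
    let newline: number;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (!line) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) process.stdout.write(chunk.message.content);
    }
  }
}
```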

4. Session history

Chat interactions are logged with full prompt/response pairs and token usage.
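
The exact log schema is internal to Iter, but a record along these lines captures what the dashboard surfaces. The field names below are hypothetical; the two token counts mirror what Ollama reports in its final response chunk (prompt_eval_count and eval_count).

```typescript
// Sketch: a hypothetical shape for one logged chat interaction.
// Field names are illustrative, not Iter's actual schema.
import { appendFileSync } from "node:fs";

interface ChatLogEntry {
  sessionId: string;
  timestamp: string;     // ISO 8601
  model: string;
  prompt: string;
  response: string;
  promptTokens: number;  // tokens consumed by the prompt
  responseTokens: number; // tokens generated in the response
}

function logInteraction(entry: ChatLogEntry): void {
  // One JSON line per interaction keeps the history easy to grep and replay.
  appendFileSync("chat-history.jsonl", JSON.stringify(entry) + "\n");
}
```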

Chat with your local AI