# Deployment
MindRoom can be deployed in various ways depending on your needs.
## Deployment Options
| Method | Best For |
|---|---|
| Hosted Matrix + local MindRoom | Simplest setup: run only `uvx mindroom run` locally |
| Full Stack (Docker Compose) | All-in-one: bundled dashboard + Matrix (Tuwunel) + MindRoom client |
| Docker (single container) | Single MindRoom runtime or when you already have Matrix |
| Kubernetes | Multi-tenant SaaS, production |
| Direct | Development, simple setups |
## Bridges
Connect external messaging platforms to Matrix:
- Bridges overview - available bridges and how they work
- Telegram bridge - bridge Telegram chats via mautrix-telegram
## Google Services (Gmail/Calendar/Drive/Sheets)
Use these guides if you want users to connect Google accounts in the MindRoom frontend:
- Google Services OAuth (Admin Setup) - one-time setup for shared/team deployments
- Google Services OAuth (Individual Setup) - single-user bring-your-own OAuth app setup
## Quick Start
### Hosted Matrix + local MindRoom (simplest)

```bash
# Creates ~/.mindroom/config.yaml and ~/.mindroom/.env by default
uvx mindroom config init --profile public
$EDITOR ~/.mindroom/.env
uvx mindroom connect --pair-code ABCD-EFGH
uvx mindroom run
```
Generate the pair code at https://chat.mindroom.chat under Settings -> Local MindRoom.
See Hosted Matrix deployment for the full walkthrough.
### Full Stack (recommended)

```bash
git clone https://github.com/mindroom-ai/mindroom-stack
cd mindroom-stack
cp .env.example .env
$EDITOR .env  # add at least one AI provider key
docker compose up -d
```
The stack exposes MindRoom at http://localhost:8765, the MindRoom client at http://localhost:8080, and Matrix at http://localhost:8008. By default it uses the published `mindroom`, `mindroom-cinny`, and `mindroom-tuwunel` images.

If you access the stack from another device, set `CLIENT_HOMESERVER_URL=http://<host-ip>:8008` in `.env` before starting it.
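For example, a minimal `.env` for the stack might look like the sketch below; the key name is one of the provider options listed under Required Configuration, and the IP address is a placeholder:

```bash
# At least one AI provider key is required:
OPENAI_API_KEY=sk-...
# Only needed when the stack is accessed from another device:
CLIENT_HOMESERVER_URL=http://192.168.1.50:8008
```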
### Direct (Development)
The config file path is set via `MINDROOM_CONFIG_PATH`; if unset, it defaults to `./config.yaml`, then `~/.mindroom/config.yaml`.
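That lookup order can be sketched in shell (illustrative only, not the actual MindRoom implementation):

```bash
# Resolve the config path: the explicit env var wins, then ./config.yaml
# in the current directory, then the per-user default under ~/.mindroom.
resolve_config() {
  if [ -n "${MINDROOM_CONFIG_PATH:-}" ]; then
    echo "$MINDROOM_CONFIG_PATH"
  elif [ -f ./config.yaml ]; then
    echo "./config.yaml"
  else
    echo "$HOME/.mindroom/config.yaml"
  fi
}
```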
If you want local Matrix + Cinny with a host-installed MindRoom runtime (Linux/macOS), use:

```bash
mindroom local-stack-setup --synapse-dir /path/to/mindroom-stack/local/matrix
mindroom run --storage-path ./mindroom_data
```
### Docker (single container)

```bash
docker run -d \
  --name mindroom \
  -p 8765:8765 \
  -v ./config.yaml:/app/config.yaml:ro \
  -v ./mindroom_data:/app/mindroom_data \
  --env-file .env \
  ghcr.io/mindroom-ai/mindroom:latest
```
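The same container can also be run under Docker Compose; this sketch mirrors the `docker run` flags above (image, port, mounts, and env file unchanged):

```yaml
services:
  mindroom:
    image: ghcr.io/mindroom-ai/mindroom:latest
    container_name: mindroom
    ports:
      - "8765:8765"
    volumes:
      - ./config.yaml:/app/config.yaml:ro
      - ./mindroom_data:/app/mindroom_data
    env_file:
      - .env
```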
See the Docker deployment guide for the full single-container setup.
### Kubernetes
See the Kubernetes deployment guide for Helm chart configuration.
## Required Configuration
Full stack: at least one AI provider key in `.env` (see Quick Start above).

Direct and single-container deployments:

- Matrix homeserver - set `MATRIX_HOMESERVER` (must allow open registration for agent accounts)
- AI provider keys - at least one of `OPENAI_API_KEY`, `OPENROUTER_API_KEY`, etc.
- Persistent storage - mount `mindroom_data/` to persist agent state (including `sessions/`, `learning/`, and memory data)
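Putting that together, a minimal `.env` for a direct or single-container deployment might look like this (all values are placeholders):

```bash
# Homeserver must allow open registration for agent accounts:
MATRIX_HOMESERVER=https://matrix.example.com
# At least one AI provider key:
OPENAI_API_KEY=sk-...
```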
See the Docker guide for the complete environment variable reference.
Hosted mindroom.chat deployments additionally use values from `mindroom connect` (`MINDROOM_LOCAL_CLIENT_ID`, `MINDROOM_LOCAL_CLIENT_SECRET`, and `MINDROOM_NAMESPACE`) to bootstrap agent registrations and avoid collisions on shared homeservers.