Your AI coworker for any folder: local-first, secure by design, cross-platform, and built for supervised automation.
Tandem is an engine-owned workflow runtime for coordinated autonomous work.
The current landscape of AI agents is flooded with chat-first assistants, and these conversational routing models break down at scale due to context bloat and concurrency blindness. Chat is fine as an interface, but it is a weak coordination substrate for parallel, durable engineering workflows.
Tandem takes a fundamentally different approach to tackle the complex realities of agentic engineering. We treat autonomous execution as a distributed systems problem, prioritizing robust engine state over fragile chat transcripts.
It provides durable coordination primitives, including blackboards, workboards, explicit task claiming, operational memory accumulation, and checkpoints, allowing multiple agents to work concurrently on complex, long-running software engineering and automation tasks without colliding.
Durable State → Workboards → Agent Swarm → Artifacts
→ Connect an agent via MCP · Download desktop app · Read the docs
Install the master CLI, then bootstrap the panel and its engine service:
```sh
npm i -g @frumu/tandem
tandem install panel
tandem panel init
tandem panel open
```
Use this when you want the browser-based control center backed by the engine.
For local installs, you can now open Settings -> Providers -> openai-codex and choose Connect Codex Account to sign in through the browser instead of pasting an OpenAI API key.
Generate a fully editable control panel app in your own folder:
```sh
npm create tandem-panel@latest my-panel
cd my-panel
npm install
npm run dev
```
Use this when you want to customize routes, pages, themes, styles, or runtime behavior without editing node_modules.
If you want an existing agent to help install or configure Tandem, connect that agent to Tandem’s MCP interface first. The MCP docs explain how to wire your own agent into Tandem so it can assist with setup, configuration, and follow-up tasks.
If you only want the engine runtime, you can keep it foreground-only:
```sh
tandem-engine serve --hostname 127.0.0.1 --port 39731
```
Terminal UI:

```sh
npm i -g @frumu/tandem-tui && tandem-tui
```

SDK clients: `npm install @frumu/tandem-client` (TypeScript) or `pip install tandem-client` (Python).

```mermaid
graph TD
    %% Clients
    Desktop[Desktop App]
    ControlPanel[Web Control Panel]
    TUI[Terminal UI]
    API[SDKs & API Clients]

    subgraph "Tandem Engine (Source of Truth)"
        Orchestrator[Orchestration & Approvals]
        Blackboard[(Blackboard & Shared State)]
        Memory[(Vector Memory & Checkpoints)]
        Worktrees[Git Worktree Isolation]
    end

    subgraph "Agent Swarm"
        Planner[Planner Agent]
        Builder[Builder Agent]
        Validator[Verifier Agent]
    end

    Desktop -.-> Orchestrator
    ControlPanel -.-> Orchestrator
    TUI -.-> Orchestrator
    API -.-> Orchestrator

    Orchestrator --> Blackboard
    Orchestrator --> Memory
    Orchestrator --> Worktrees

    Blackboard <--> Planner
    Blackboard <--> Builder
    Blackboard <--> Validator
```
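The explicit task-claiming primitive is what lets several agents share one workboard without collisions. As an illustrative sketch only (this is not Tandem's actual API), the core idea is that a claim either succeeds atomically or returns nothing, so a second agent can never start work on an already-claimed task:

```typescript
// Illustrative sketch of explicit task claiming on a shared workboard.
// Names and types are hypothetical, NOT Tandem's real API.
type Task = { id: string; claimedBy?: string; done: boolean };

class Workboard {
  private tasks = new Map<string, Task>();

  add(id: string): void {
    this.tasks.set(id, { id, done: false });
  }

  // A claim succeeds only if the task exists, is unclaimed, and not done;
  // a second claimant gets null instead of colliding.
  claim(id: string, agent: string): Task | null {
    const task = this.tasks.get(id);
    if (!task || task.claimedBy || task.done) return null;
    task.claimedBy = agent;
    return task;
  }

  // Only the claiming agent may mark the task complete.
  complete(id: string, agent: string): boolean {
    const task = this.tasks.get(id);
    if (!task || task.claimedBy !== agent) return false;
    task.done = true;
    return true;
  }
}

const board = new Workboard();
board.add("refactor-auth");
console.log(board.claim("refactor-auth", "builder-1") !== null); // true: claim succeeds
console.log(board.claim("refactor-auth", "builder-2") !== null); // false: already claimed
console.log(board.complete("refactor-auth", "builder-1"));       // true
```

In the real engine the claim would be persisted durable state rather than an in-memory map, but the contract is the same: claim first, work second.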
| Task | What Tandem does |
|---|---|
| Refactor a codebase safely | Scans files, proposes a staged plan, shows diffs, and applies approved changes |
| Research and summarize sources | Reads multiple references and outputs structured summaries |
| Generate recurring reports | Runs scheduled automations and produces markdown/dashboard artifacts |
| Connect external tools through MCP | Uses configured MCP connectors with approval-aware execution |
| Operate AI workflows via API | Runs sessions through local/headless HTTP + SSE endpoints |
Sensitive files and paths (.env, .ssh/*, *.pem, *.key, secrets folders) are treated as protected.

The SDKs are API clients. They do not bundle tandem-engine.
You need a running Tandem runtime (desktop sidecar or headless engine) and then use the SDKs to create sessions, trigger runs, and stream events.
Runtime options:
Desktop app running locally (starts the sidecar runtime)
Headless engine via npm:
```sh
npm install -g @frumu/tandem
tandem-engine serve --hostname 127.0.0.1 --port 39731
```
TypeScript SDK: @frumu/tandem-client
Python SDK: tandem-client
Engine package: @frumu/tandem
```typescript
// npm install @frumu/tandem-client
import { TandemClient } from "@frumu/tandem-client";

const client = new TandemClient({ baseUrl: "http://localhost:39731", token: "..." });
const sessionId = await client.sessions.create({ title: "My agent" });
const { runId } = await client.sessions.promptAsync(sessionId, "Summarize README.md");

for await (const event of client.stream(sessionId, runId)) {
  if (event.type === "session.response") process.stdout.write(event.properties.delta ?? "");
}
```
```python
# pip install tandem-client
import asyncio
from tandem_client import TandemClient

async def main() -> None:
    async with TandemClient(base_url="http://localhost:39731", token="...") as client:
        session_id = await client.sessions.create(title="My agent")
        run = await client.sessions.prompt_async(session_id, "Summarize README.md")
        async for event in client.stream(session_id, run.run_id):
            if event.type == "session.response":
                print(event.properties.get("delta", ""), end="", flush=True)

asyncio.run(main())
```
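Both SDK examples stream the model output incrementally as `session.response` events carrying a `delta`. Assembling the full response is just a fold over those deltas. The sketch below mocks the stream (the `session.status` filler event is invented for illustration); only the `session.response` shape comes from the examples above:

```typescript
// Sketch: accumulate streamed `session.response` deltas into the final text.
// The stream here is mocked; `session.status` is a hypothetical filler event.
type SessionEvent = {
  type: string;
  properties: { delta?: string };
};

async function* mockStream(): AsyncGenerator<SessionEvent> {
  yield { type: "session.response", properties: { delta: "Hello, " } };
  yield { type: "session.status", properties: {} }; // non-text event, ignored
  yield { type: "session.response", properties: { delta: "world" } };
}

async function collectResponse(events: AsyncIterable<SessionEvent>): Promise<string> {
  let text = "";
  for await (const event of events) {
    // Only response events contribute text; everything else is skipped.
    if (event.type === "session.response") text += event.properties.delta ?? "";
  }
  return text;
}

collectResponse(mockStream()).then((text) => console.log(text)); // prints "Hello, world"
```

Swapping `mockStream()` for `client.stream(sessionId, runId)` gives the same fold over a live run.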
Configure providers in Settings.
| Provider | Description | Get API key |
|---|---|---|
| OpenAI Codex Account | Browser sign-in for local Codex-account usage | Local control panel: Settings -> Providers -> openai-codex |
| OpenRouter ⭐ | Access many models through one API | openrouter.ai/keys |
| OpenCode Zen | Fast, cost-effective models optimized for coding | opencode.ai/zen |
| Anthropic | Anthropic models (Sonnet, Opus, Haiku) | console.anthropic.com |
| OpenAI | GPT models and OpenAI endpoints | platform.openai.com |
| Ollama | Local models (no remote API key required) | Setup Guide |
| Custom | OpenAI-compatible API endpoint | Configure endpoint URL |
Notes:

- `openai-codex` is currently intended for local engine-backed Tandem setups; standard API-key access uses the `openai` provider.
- `websearch` can now be configured directly from Settings.

Recommended default: Backend = `auto`. `auto` prefers configured providers and can fall through across backends instead of pinning the engine to a single hosted search path. For headless installs you can still configure this through env vars:
```sh
TANDEM_SEARCH_BACKEND=auto
TANDEM_BRAVE_SEARCH_API_KEY=...
TANDEM_EXA_API_KEY=...
TANDEM_SEARXNG_URL=http://127.0.0.1:8080
TANDEM_SEARCH_URL=https://search.tandem.ac
```
If Brave is rate-limited and Exa is configured, auto can continue with Exa instead of immediately surfacing search as unavailable.
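The fall-through behavior can be pictured as a simple provider chain. The sketch below is illustrative only (names and types are hypothetical, not the engine's implementation): try each configured backend in order and move on when one fails, instead of surfacing the first failure as "search unavailable":

```typescript
// Illustrative sketch of the `auto` backend fall-through. Backends are tried
// in order; a failure (e.g. a 429 rate limit) falls through to the next one.
type SearchBackend = {
  name: string;
  search: (query: string) => Promise<string[]>;
};

async function autoSearch(backends: SearchBackend[], query: string): Promise<string[]> {
  const errors: string[] = [];
  for (const backend of backends) {
    try {
      return await backend.search(query);
    } catch (err) {
      errors.push(`${backend.name}: ${err}`); // record and fall through
    }
  }
  // Only fail once every configured backend has been exhausted.
  throw new Error(`all search backends failed: ${errors.join("; ")}`);
}

// Simulated scenario from the text: Brave is rate-limited, Exa picks up.
const brave: SearchBackend = {
  name: "brave",
  search: async () => { throw new Error("429 rate limited"); },
};
const exa: SearchBackend = {
  name: "exa",
  search: async (q) => [`exa result for ${q}`],
};

autoSearch([brave, exa], "tandem engine").then((r) => console.log(r[0]));
// prints "exa result for tandem engine"
```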
Licensing: MIT OR Apache-2.0 for most Rust crates, and BUSL-1.1 for tandem-plan-compiler, as documented in docs/LICENSING.md.

Security: by default, network exposure is limited to localhost (127.0.0.1) and configured endpoints. For the full threat model and reporting process, see SECURITY.md.
Advanced MCP behavior (including OAuth/auth-required flows and retries) is documented in docs/ENGINE_CLI.md.
| Platform | Additional requirements |
|---|---|
| Windows | Build Tools for Visual Studio |
| macOS | Xcode Command Line Tools: xcode-select --install |
| Linux | libwebkit2gtk-4.1-dev, libappindicator3-dev, librsvg2-dev, build-essential, pkg-config |
```sh
git clone https://github.com/frumu-ai/tandem.git
cd tandem
pnpm install
cargo build -p tandem-ai
pnpm tauri dev
pnpm tauri build
```
For local self-built updater artifacts, generate your own signing keys and configure:
```sh
pnpm tauri signer generate -w ./src-tauri/tandem.key
```

Set `TAURI_SIGNING_PRIVATE_KEY` and `TAURI_SIGNING_PASSWORD` in the build environment, and set the generated `pubkey` in src-tauri/tauri.conf.json.

Reference: Tauri signing documentation
Output paths:
```sh
# Windows: src-tauri/target/release/bundle/msi/
# macOS:   src-tauri/target/release/bundle/dmg/
# Linux:   src-tauri/target/release/bundle/appimage/
```
If a downloaded .dmg shows “damaged” or “corrupted”, Gatekeeper is usually rejecting an app bundle/DMG that is not Developer ID signed and notarized.
- Make sure the download matches your CPU architecture (aarch64/arm64 vs x86_64/x64).
- To open an unsigned build anyway: Right click -> Open, or System Settings -> Privacy & Security -> Open Anyway.

Contributions are welcome. See CONTRIBUTING.md.
```sh
# Run lints
pnpm lint

# Run tests
pnpm test
cargo test

# Format code
pnpm format
cargo fmt
```
Engine-specific build/run/smoke instructions: docs/ENGINE_TESTING.md
Engine CLI usage reference: docs/ENGINE_CLI.md
Engine runtime communication contract: docs/ENGINE_COMMUNICATION.md
Release automation:

- .github/workflows/release.yml (tag pattern `v*`)
- .github/workflows/publish-registries.yml (manual trigger or `publish-v*`)

Repository layout:

```
tandem/
├── src/              # React frontend
│   ├── components/   # UI components
│   ├── hooks/        # React hooks
│   └── lib/          # Utilities
├── src-tauri/        # Rust backend
│   ├── src/          # Rust source
│   ├── capabilities/ # Permission config
│   └── binaries/     # Sidecar (gitignored)
├── scripts/          # Build scripts
└── docs/             # Documentation
```
Vector memory is backed by sqlite-vec.

If Tandem saves you time, consider sponsoring development.
This repository uses a mixed licensing model:
- Core engine crates and tools (e.g. tandem-core, tandem-server, tandem-types, tandem-orchestrator, and others in crates/): MIT OR Apache-2.0 (see LICENSE and LICENSE-APACHE)
- Mission compiler crate (tandem-plan-compiler): Business Source License (BUSL-1.1); see crates/tandem-plan-compiler/LICENSE for terms

In short: the runtime engine is fully open source (MIT/Apache), and the mission/plan compiler is source-available under BUSL.