An agent-first orchestration layer. Bring your own model. Give it real read and write access to the tools you work in. Ship autonomous workflows without proxying tokens or rebuilding integrations.
Connect Claude, OpenAI, Gemini, and local models. Cortex routes work between them as parallel agents with full read and write access to the tools you already use.
Your data and your tokens stay on your machine. Cortex never proxies model traffic through our servers. Cloud sync is opt-in and end-to-end encrypted.
Every action a human can take in Cortex is also a structured, deterministic, idempotent operation an agent can call — same product, two grammars.
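A minimal sketch of what "structured, deterministic, idempotent" could mean in practice. The envelope shape here (an `op` name, an `args` dict, and an `idempotency_key`) is a hypothetical illustration, not Cortex's actual operation schema: replaying the same operation produces exactly one effect.

```python
from dataclasses import dataclass, field

@dataclass
class Workspace:
    """Hypothetical workspace: state plus a record of applied operations."""
    state: dict = field(default_factory=dict)
    applied: set = field(default_factory=set)  # idempotency keys already seen

    def apply(self, op: str, args: dict, idempotency_key: str) -> dict:
        """Deterministic and idempotent: replaying the same key is a no-op."""
        if idempotency_key in self.applied:
            return self.state  # duplicate delivery; no second effect
        if op == "set_title":
            self.state["title"] = args["title"]
        self.applied.add(idempotency_key)
        return self.state

ws = Workspace()
ws.apply("set_title", {"title": "Launch plan"}, "agent-7:001")
ws.apply("set_title", {"title": "Launch plan"}, "agent-7:001")  # retried; still one write
```

Idempotency keys are what make agent retries safe: a network hiccup or a re-delivered message cannot double-apply a mutation, so the human-facing UI and the agent-facing API can share one operation log.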
Auth, audit trails, retries, cost guardrails, and observability are first-class. Built so AI engineers can put autonomous workflows in front of customers.
- Approach
- Native desktop shell with file-system state. Lightweight runtime, no background services.
- Coordination
- Workspace lock plus shared state.json. Agents claim and release; reviewers read the diff before commit.
- Providers
- Bring your own key. Models route through your machine. The company never sees the prompt or the output.
- Distribution
- Desktop app for macOS, Windows, and Linux. Distribution details to follow before launch.
- License
- Licensing model to be announced before launch.
- Status
- Pre-launch. Dogfooded daily by the founder.
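The claim/release coordination described above can be sketched with an exclusive lock file guarding a shared state.json in which agents record what they are working on. The file names, the lock mechanism, and the state schema here are assumptions for illustration, not Cortex's actual on-disk layout.

```python
import json
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def workspace_lock(root: str):
    """Exclusive advisory lock: O_CREAT|O_EXCL fails if another agent holds it."""
    lock_path = os.path.join(root, ".lock")
    fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    try:
        yield
    finally:
        os.close(fd)
        os.remove(lock_path)

def claim(root: str, agent: str, task: str) -> None:
    """Record a claim in state.json, refusing tasks another agent holds."""
    state_path = os.path.join(root, "state.json")
    with workspace_lock(root):
        if os.path.exists(state_path):
            with open(state_path) as f:
                state = json.load(f)
        else:
            state = {"claims": {}}
        if task in state["claims"]:
            raise RuntimeError(f"{task} already claimed by {state['claims'][task]}")
        state["claims"][task] = agent
        with open(state_path, "w") as f:
            json.dump(state, f, indent=2)

def release(root: str, agent: str, task: str) -> None:
    """Drop a claim, but only if this agent actually holds it."""
    state_path = os.path.join(root, "state.json")
    with workspace_lock(root):
        with open(state_path) as f:
            state = json.load(f)
        if state["claims"].get(task) == agent:
            del state["claims"][task]
        with open(state_path, "w") as f:
            json.dump(state, f, indent=2)

root = tempfile.mkdtemp()
claim(root, "agent-a", "refactor-auth")
release(root, "agent-a", "refactor-auth")
```

Because every claim passes through the lock, two parallel agents cannot grab the same task, and a reviewer can diff state.json (and the workspace it describes) before anything is committed.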