fabriqa.ai

Write the spec. Agents build it. You ship.

Run coordinated AI agent teams across Claude Code, Codex, Gemini CLI, OpenCode, and Factory Droid using your own subscriptions. No new API keys. No vendor lock-in. One workspace.

  • Define a spec once, then split work across multiple agents in parallel.
  • See handoffs, execution progress, and code diffs before you merge.
  • Switch tools and providers without losing context or history.

Not a replacement. fabriqa.ai enhances your workflow across these tools.

Dev channel release. We are building in public, so bugs can happen. Share feedback to help us improve quickly.

Download dev channel

The problem

You are already paying for AI coding tools. Why pay again?

You already use Claude Code, Codex, or Gemini CLI. The hard part is coordinated execution across multiple agents without context loss. fabriqa.ai connects the tools you already pay for into one workflow.

How it works

From spec to shipped code in three steps.

1. Define your spec. Write what you want built. Bring your own specs.md or paste your intent directly.
2. Agents execute in parallel. Split backend, frontend, testing, and review across tools while preserving one shared timeline.
3. Review, merge, ship. Inspect changes, approve what is ready, and redirect what is not.
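
As a sketch, the spec in step one might look something like this. The file name follows the specs.md convention mentioned on this page; the feature, sections, and endpoints are purely illustrative, not a required format:

```markdown
# Feature: password reset flow

## Backend
- Add POST /auth/reset-request and POST /auth/reset-confirm endpoints
- Reset tokens expire after 30 minutes

## Frontend
- Reset-request form with email validation
- Confirmation page that consumes the token

## Testing
- Unit tests for token generation and expiry
- End-to-end test covering the full reset flow
```

Each top-level section can then be handed to a different agent in step two, while the shared timeline keeps the handoffs visible.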

Why developers choose fabriqa.ai

Built for real shipping workflows, not demo-only automation.

Use existing subscriptions

You pay for orchestration, not token markup. Keep your current tool subscriptions.

Spec-driven execution

Autonomous agents without a spec drift into chaos. fabriqa.ai keeps execution aligned with the spec you wrote.

Unified context across providers

Switch providers mid-conversation without breaking context or history. Example: Claude Code ↔ Codex, Codex ↔ Gemini CLI, Gemini CLI ↔ OpenRouter in one shared timeline.

Built on specs.md (open source) to keep specification quality at the center of delivery.

FAQ

Is it free during the dev channel?

Yes. Dev channel desktop builds are free while we iterate with early users.

Is this another editor?

No. fabriqa.ai is an orchestration layer. Keep your preferred tools and subscriptions, then coordinate them in one workflow.

Can it run with local models?

Yes. Local and offline setups are supported, including Ollama-based workflows where your environment allows it.

Which platforms are available?

Desktop builds are available for macOS, Windows, and Linux.