Documentation

Getting Started (Coming Soon)

Codesign runs as a Docker Compose stack: clone the repo, set your environment variables, and start the pipeline. The repository is not yet publicly available; check back soon.

Quick Start

git clone https://github.com/atdash/bespoketracker.git
cd bespoketracker
cp .env.example .env
# Edit .env with your LLM provider API keys
docker compose up -d

Codesign will be available at http://localhost:8000. The default environment template is Startup: a lean-team profile with rapid iteration and minimal approval gates.

Architecture Overview

Codesign's core is a graph database (Kuzu) that models the relationships between product artifacts. The key entities are:

  • FeatureNode — a product feature or capability
  • SpecNode — a specification describing how a feature should work
  • WorkflowNode — an implementation plan for a spec
  • WorkCycle — an iteration unit that measures coherence over time

Each node carries versioning fields that drive the product model. When a SpecNode's version increments, the propagation engine walks the dependency graph and marks downstream WorkflowNodes as stale. The coherence report surfaces this staleness so teams can see which implementation plans need revision.
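The propagation step can be sketched as a breadth-first walk over the dependency graph. This is an illustrative sketch, not Codesign's actual engine: the node identifiers and the `version`/`stale` fields are assumptions based on the description above.

```python
from collections import defaultdict, deque

class DependencyGraph:
    """Toy model of spec -> workflow dependencies with staleness propagation."""

    def __init__(self):
        self.downstream = defaultdict(list)  # node -> nodes that depend on it
        self.version = defaultdict(int)      # node -> current version number
        self.stale = set()                   # nodes whose upstream has changed

    def add_dependency(self, upstream, dependent):
        self.downstream[upstream].append(dependent)

    def bump_version(self, node):
        """Increment a node's version and mark everything downstream stale."""
        self.version[node] += 1
        queue = deque(self.downstream[node])
        while queue:
            n = queue.popleft()
            if n not in self.stale:          # visit each node at most once
                self.stale.add(n)
                queue.extend(self.downstream[n])

g = DependencyGraph()
g.add_dependency("spec:auth", "workflow:auth-impl")
g.add_dependency("workflow:auth-impl", "workcycle:7")
g.bump_version("spec:auth")
print(sorted(g.stale))  # ['workcycle:7', 'workflow:auth-impl']
```

A coherence report would then be a matter of listing the contents of the stale set alongside each node's last-reviewed version.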

Environment Templates

Templates configure pipeline gates, review cadences, and coherence thresholds. Select a template during initial setup or switch at any time:

# In .env
BESPOKE_ENV_TEMPLATE=enterprise

# Or via API
curl -X POST http://localhost:8000/api/config/template \
  -d '{"template": "enterprise"}'
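Conceptually, a template name resolves to a bundle of gate and threshold settings. The sketch below is a hypothetical illustration: the template names come from this page, but the specific fields and values are assumptions, not Codesign's real configuration schema.

```python
# Hypothetical mapping of template names to pipeline settings.
TEMPLATES = {
    "startup":    {"approval_gates": 1, "review_cadence_days": 7,  "coherence_threshold": 0.6},
    "enterprise": {"approval_gates": 3, "review_cadence_days": 14, "coherence_threshold": 0.8},
}

def resolve_template(name: str) -> dict:
    """Look up a template's settings, case-insensitively."""
    try:
        return TEMPLATES[name.lower()]
    except KeyError:
        raise ValueError(f"unknown template: {name!r}") from None

print(resolve_template("enterprise")["approval_gates"])  # 3
```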

LLM Configuration

Codesign routes LLM calls through a provider abstraction layer. Configure your preferred provider in .env:

# Anthropic (default)
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...

# OpenAI
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...

# Local models via Ollama
LLM_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434

Full API reference and advanced configuration are available in the GitHub repository.