rederive.dev
The build pipeline

Eight stages from specification to signed materialization.

Five pure functions, one language-model call, one seven-backend verification pass, one cryptographic-signing step. Each stage emits structured events; tooling, CI, and the browser UI hook into the stream. The platform's commitment is correctness and auditability, not transfer efficiency, not magic.

Stage 1 · pure function · milliseconds

read

Load the specification file from disk.

The pipeline starts here on every build, with no caching. Reads are cheap enough that re-reading on every build is the right discipline; the alternative would introduce cache-invalidation surface for no measurable gain.

Consumes
A path to a .constraints.md file.
Produces
The file's UTF-8 source as a string.
Substrate cost
Zero. No network, no model call.
Failure modes
File not found, permissions error, encoding error. The error message names the path; the engineer corrects the path.
Why this stage matters
The byte count is recorded on the stage event, useful for sanity checks. A zero-byte file is almost always an authoring error caught here rather than later.
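The read stage can be sketched in a few lines. This is a hypothetical illustration; the event schema and field names (`stage`, `path`, `bytes`) are assumptions, not the platform's real API.

```python
from pathlib import Path

def read_stage(path: str) -> tuple[str, dict]:
    """Load a .constraints.md specification and emit a structured stage event.

    Sketch only: field names in the event are assumptions.
    """
    # Raises on missing file, permissions error, or bad encoding,
    # naming the path in the exception.
    source = Path(path).read_text(encoding="utf-8")
    event = {"stage": "read", "path": path, "bytes": len(source.encode("utf-8"))}
    if event["bytes"] == 0:
        # A zero-byte file is almost always an authoring error; fail here.
        raise ValueError(f"zero-byte specification: {path}")
    return source, event
```

Recording the byte count on the event gives downstream tooling a cheap sanity check before any expensive stage runs.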

Stage 2 · pure function · milliseconds

parse

Tokenize the Markdown and extract the structured requirements object.

The parser does not interpret the prose body and does not run the fenced evidence; that is the verifier's job in stage seven. The parser's contract is structural correctness — a cheap, reliable predicate to check before any expensive stage runs.

Consumes
The file's source string.
Produces
A structured object: an array of requirements (each with id, type, authority, scope, status, depends-on, prose body, fenced evidence blocks, source line) plus a manifest (provides, imports, pins).
Substrate cost
Zero.
Failure modes
Malformed metadata block, unrecognized requirement type, malformed manifest header, syntactically broken fenced block. Errors carry source-line numbers; most are typo-class.
Why this stage matters
The structured object the parser produces is the platform's working representation; every later stage operates on it rather than on the source string.
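The shape of that working representation can be sketched as plain dataclasses. The field names below are lifted from the prose above (id, type, authority, scope, status, depends-on, body, evidence, source line; provides, imports, pins) but their exact types are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One parsed requirement. Field names follow the prose; types are assumed."""
    id: str
    type: str
    authority: str
    scope: str
    status: str
    depends_on: list[str]
    body: str                 # prose body: carried, not interpreted, by the parser
    evidence: list[str]       # fenced evidence blocks: run by verify, not parse
    source_line: int          # for typo-class error messages

@dataclass
class Manifest:
    """The manifest header: what this file provides, imports, and pins."""
    provides: list[str] = field(default_factory=list)
    imports: list[str] = field(default_factory=list)
    pins: dict[str, str] = field(default_factory=dict)  # import path -> pinned hash
```

Every later stage operates on objects of this shape rather than on the source string.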

Stage 3 · pure function · milliseconds

validate

Check semantic correctness above the prose layer.

Validate is the platform's semantic-correctness check above the prose. If validate passes, the platform has a coherent specification to work with downstream. If it fails, no further stage will help.

Consumes
The structured requirements object from parse.
Produces
A validation report: pass, or fail with a list of structured errors.
Substrate cost
Zero.
Failure modes
Duplicate requirement identifiers; dependency cycles in depends-on; references to nonexistent identifiers; manifest cross-references that don't resolve; unrecognized type values.
Why this stage matters
The validator does not catch missing dependencies — it cannot read the engineer's mind — but it catches typos in the dependency graph, duplicated identifiers, and bookkeeping errors that would otherwise propagate into harder-to-debug failures further down the pipeline.
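The three checks named above (duplicate identifiers, dangling references, dependency cycles) are classic graph bookkeeping. A minimal sketch, taking requirements as (id, depends-on) pairs; the error-message wording is an assumption.

```python
def validate(requirements: list[tuple[str, list[str]]]) -> list[str]:
    """Return a list of structured errors; an empty list means pass."""
    errors = []

    # Duplicate requirement identifiers.
    seen = set()
    for rid, _ in requirements:
        if rid in seen:
            errors.append(f"duplicate requirement id: {rid}")
        seen.add(rid)

    # References to nonexistent identifiers.
    graph = {rid: deps for rid, deps in requirements}
    for rid, deps in graph.items():
        for dep in deps:
            if dep not in graph:
                errors.append(f"{rid} depends on unknown id: {dep}")

    # Dependency cycles, via three-color depth-first search.
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {rid: WHITE for rid in graph}
    def visit(rid):
        color[rid] = GRAY
        for dep in graph[rid]:
            if color.get(dep) == GRAY:          # back edge: a cycle
                errors.append(f"dependency cycle through: {dep}")
            elif color.get(dep) == WHITE:
                visit(dep)
        color[rid] = BLACK
    for rid in graph:
        if color[rid] == WHITE:
            visit(rid)

    return errors
```

All three checks are cheap predicates over the parsed object, which is why they run before any expensive stage.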

Stage 4 · pure function · milliseconds

resolve

Resolve @imports directives and prepare the import context.

Resolve is where cross-file composition lives. The stage threads imported interfaces (not full code) into the prompt for the derive stage, keeping prompts under the substrate's output budget while preserving cross-module type resolution.

Consumes
The validated requirements object and the manifest's @imports declarations.
Produces
A resolved import context: each imported specification's content hash, parsed object, materialized code (if any), and exported interface.
Substrate cost
Zero (plus disk reads of imported files).
Failure modes
Imported file not found at the declared path; imported file's threshold property is not currently passing; pin hash mismatch (consumer pinned to a hash that no longer matches); cycle in the import graph.
Why this stage matters
Pin verification at this stage is the platform's safety net against silent drift in dependencies. For files with no manifest, this stage is a no-op and reports skip.
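The pin check itself reduces to a hash comparison. A sketch, assuming the pin manifest maps import paths to 64-character hex SHA-256 hashes; the exception wording is hypothetical.

```python
import hashlib

def check_pin(import_path: str, canonical_bytes: bytes, pins: dict[str, str]) -> None:
    """Verify a consumer's pin against the imported spec's current content hash.

    Sketch: pins maps import path -> expected hex SHA-256. Raises on drift.
    """
    if import_path not in pins:
        return  # unpinned import: nothing to verify
    actual = hashlib.sha256(canonical_bytes).hexdigest()
    if actual != pins[import_path]:
        # Silent drift in a dependency surfaces here, before derive spends a call.
        raise ValueError(
            f"pin mismatch for {import_path}: "
            f"pinned {pins[import_path][:12]}, current {actual[:12]}"
        )
```

A mismatch at this stage means the imported specification changed after the consumer pinned it, which is exactly the drift the safety net exists to catch.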

Stage 5 · pure function · milliseconds

canonicalize

Normalize the requirements object and compute the SHA-256 content hash.

This is the layer that gives identity to the specification. Two engineers who author the same requirements with different whitespace and metadata field order produce identical canonical bytes and identical hashes.

Consumes
The validated requirements object.
Produces
Canonical bytes (deterministic serialization with ordered fields and normalized whitespace) plus a 64-character hex SHA-256 hash.
Substrate cost
Zero.
Failure modes
None in normal operation. Canonicalization is a pure function over a validated object.
Why this stage matters
The hash flows downstream into the derive prompt context, the verify report, the provenance tuple, the cache lookup key, the wire-protocol address, and the team's audit log. Identity is assigned by content address once, at this layer, and used consistently everywhere downstream.
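The property that different authoring orders yield identical bytes can be sketched with deterministic JSON serialization. The platform's actual canonical form is richer (ordered requirement fields, whitespace normalization inside prose bodies); this is a minimal stand-in.

```python
import hashlib
import json

def canonicalize(requirements_obj: dict) -> tuple[bytes, str]:
    """Deterministic serialization plus SHA-256 content hash.

    Sketch: sorted keys and fixed separators make the bytes independent
    of the order in which fields were authored.
    """
    canonical = json.dumps(
        requirements_obj,
        sort_keys=True,
        separators=(",", ":"),
        ensure_ascii=False,
    ).encode("utf-8")
    return canonical, hashlib.sha256(canonical).hexdigest()
```

Two objects that differ only in field order hash identically, which is the sense in which two engineers authoring the same requirements produce the same identity.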

Stage 6 · language-model call · seconds

derive

Call the language-model substrate with the canonical specification and produce candidate code.

This is the only stage that introduces non-determinism. Two derivations against the same specification with the same substrate may produce subtly different code, both correct under verification. Verification is the contract.

Consumes
The canonical bytes (as text), the target language, the substrate handle, the resolved import context, the pin manifest.
Produces
Candidate code as a string, plus a receipt recording substrate identity, model identifier, prompt hash, and call timing.
Substrate cost
One substrate call (or several, in multi-call mode for engine-scope work).
Failure modes
Substrate API error; output budget exceeded (truncation; recovery is multi-call mode); syntactically invalid output (rare; verify will catch it); substrate refused.
Why this stage matters
The substrate handle is small and substrate-agnostic, so swapping substrates is a small engineering task. The platform's commitment is that the binary may differ as long as verification passes.
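How small that handle can be is worth making concrete. A sketch of a substrate-agnostic interface and the receipt the stage records; the method name `complete` and every field name here are assumptions, not the platform's real API.

```python
import hashlib
import time
from dataclasses import dataclass
from typing import Protocol

class Substrate(Protocol):
    """Minimal substrate handle: one identifier, one call. An assumed shape."""
    model_id: str
    def complete(self, prompt: str) -> str: ...

@dataclass
class Receipt:
    """What derive records about the call, for the provenance tuple."""
    substrate: str
    model_id: str
    prompt_hash: str
    elapsed_s: float

def derive(spec_text: str, substrate: Substrate) -> tuple[str, Receipt]:
    prompt = f"Derive code satisfying:\n{spec_text}"
    start = time.monotonic()
    code = substrate.complete(prompt)  # the only non-deterministic call in the pipeline
    receipt = Receipt(
        substrate=type(substrate).__name__,
        model_id=substrate.model_id,
        prompt_hash=hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        elapsed_s=time.monotonic() - start,
    )
    return code, receipt
```

Because the handle is a protocol rather than a concrete client, swapping substrates means implementing two members, not rewriting the stage.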

Stage 7 · seven backends · seconds

verify

Run the verification backends against the candidate code and produce the verdict.

This is the platform's acceptance contract. The reviewer does not inspect the candidate code directly; the reviewer inspects the verdict. Six backends are hard (a failure blocks signing). The language-model judge is soft (a failure is recorded but does not block).

Consumes
The candidate code, the requirements object, the resolved imports, the substrate handle (for the language-model judge), and the pin manifest.
Produces
A verification report: per-requirement verdicts (pass / fail / skip with evidence) and an overall verdict.
Substrate cost
Zero, unless the specification uses the language-model judge backend, in which case one substrate call per judgment block.
Failure modes
Per-backend failures: type-checker error; assertion failed; property fuzzer found a counterexample; pin not found; static accessibility rule violated; flow runner observed a divergence; language-model judge ruled fail (advisory only).
Why this stage matters
The hard / soft asymmetry is deliberate: a non-deterministic prose-evaluated check is not the kind of thing that should gate acceptance. Engineers who want a hard gate on a criterion the judge would check should encode the criterion as an assertion or property instead.
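The hard / soft asymmetry reduces to a small aggregation rule. A sketch, taking per-backend verdicts as strings; the backend name `llm_judge` is an assumption standing in for the language-model judge.

```python
def overall_verdict(
    results: dict[str, str],
    soft: frozenset = frozenset({"llm_judge"}),
) -> str:
    """Aggregate per-backend verdicts into the overall verdict.

    results maps backend name -> "pass" | "fail" | "skip". A failure from
    any hard backend blocks signing; a soft (advisory) failure is recorded
    in the report but does not change the overall verdict.
    """
    hard_fail = any(v == "fail" for backend, v in results.items()
                    if backend not in soft)
    return "fail" if hard_fail else "pass"
```

A criterion that must gate acceptance belongs in a hard backend (an assertion or property), never in the judge.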

Stage 8 · cryptographic signing · milliseconds

sign

Emit a content-addressed signed materialization artifact with full provenance.

The materialization artifact is the unit of evidence the platform produces. A peer with the platform's public key can verify a materialization signature locally without contacting the platform; the artifact is cryptographically self-describing.

Consumes
The specification hash, derivation function hash, substrate identity, candidate code, verification verdict, and the active signing key.
Produces
A signed materialization, written to <filename>.materialization.json next to the source. Carries the provenance tuple plus the Ed25519 signature.
Substrate cost
Zero.
Failure modes
Signing key not configured; signing key invalid; write permission error.
Why this stage matters
A regulator, auditor, or downstream consumer who needs provenance for a deployed binary can trace from the binary through its materialization back to the specification under which it was generated, with full cryptographic integrity end to end. This is what signed provenance for AI-generated code looks like in practice.
What about failures
A fail verdict is signed too. The artifact records what failed and why; engineers can re-derive without rewriting the file, and reviewers can audit the failure history.
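Assembling the artifact can be sketched end to end with the standard library, if the Ed25519 signing function is injected rather than implemented here (in practice it would come from a key library such as PyNaCl; that dependency, and every field name below, is an assumption).

```python
import hashlib
import json
from typing import Callable

def materialize(spec_hash: str, derivation_hash: str, substrate_id: str,
                code: str, verdict: str,
                sign: Callable[[bytes], bytes]) -> dict:
    """Build the signed materialization artifact.

    `sign` is an injected Ed25519 signing function; injecting it keeps
    this sketch stdlib-only. Field names are assumptions from the prose.
    """
    provenance = {
        "spec_hash": spec_hash,
        "derivation_hash": derivation_hash,
        "substrate": substrate_id,
        "code_hash": hashlib.sha256(code.encode("utf-8")).hexdigest(),
        "verdict": verdict,  # fail verdicts are signed and recorded too
    }
    # Sign the canonical serialization so any peer can re-derive the payload
    # and check the signature locally with the platform's public key.
    payload = json.dumps(provenance, sort_keys=True, separators=(",", ":")).encode()
    return {**provenance, "signature": sign(payload).hex()}
```

Because the verdict is part of the signed payload, a failure history is as auditable as a success.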

The pipeline runs in seconds. Your time is in the specification.

Read the architecture in full in the whitepaper, or open the platform if you have a preview password.