diff --git a/.claude/agents/gix-steward.md b/.claude/agents/gix-steward.md new file mode 100644 index 00000000000..3b5c41e3985 --- /dev/null +++ b/.claude/agents/gix-steward.md @@ -0,0 +1,269 @@ +--- +name: gix-steward +description: Use this agent as a judgment gate, escalation arbiter, and strategic-direction check for gitoxide work. Invoked at exactly four moments — (1) before a completion claim is emitted, to verify the implementer isn't gaming the evidence (most common as the rust-wiggum parity loop's completion gate, but applicable to any "it's done" moment — a gix-error migration, a feature implementation, a refactor); (2) when an implementer is blocked between two defensible designs and needs a tie-break; (3) when an implementer proposes recording a `shortcoming` note for a gap it couldn't close, to adjudicate legitimate-deferral vs keep-grinding; (4) when the implementer (typically the gix-architect) suspects the loop is stuck in a local optimum — recurring workarounds, deferral clusters pointing at a shared missing piece, tie-breaks that keep surfacing the same structural question — and requests a strategic direction check. Cadence for moment (4) is the architect's judgment; steward does not schedule itself. Not invoked per iteration. Produces structured verdicts, not prose critiques. Examples: Context: Architect is about to emit a completion promise for git push. user: '@gix-steward verify the completion promise for git push' assistant: 'I will run both binaries on a fresh fixture, cross-check against the matrix and vendor/git manpage flag surface, and return a PASS or REJECT-WITH-ROW verdict.' Steward reads the journey test file, the matrix row, the git manpage flag surface, and runs an independent parity check before allowing the promise to emit. Context: Architect is stuck between two defensible designs. user: '@gix-steward break tie: --force-with-lease as its own Options struct vs. a field on a combined Push Options?' 
assistant: 'Reviewing both shapes against gitoxide conventions and the C reference in vendor/git/builtin/push.c, then returning the design with rationale.' Steward arbitrates judgment calls the architect cannot resolve from docs alone. Context: Architect has failed 3 attempts on a row and wants to defer. user: '@gix-steward should --signed=if-asked be a shortcoming or keep grinding?' assistant: 'I will check vendor/git for the GPG dependency, assess whether this is a gix-intentional deferral per SHORTCOMINGS.md, and decide.' Steward adjudicates whether a row is a legitimate deferral or whether the architect is quitting too early. Context: Architect has looped on git push for 12 iterations and the last 4 tie-breaks all touched how gix-refs models remote-tracking state. user: '@gix-steward direction check — I suspect gix-refs needs a remote-tracking primitive before push can close cleanly' assistant: 'Reviewing the last 12 commits, the tie-break history, and vendor/git/refs/ to decide HOLD / ADJUST / ESCALATE.' Steward steps back from row-by-row verdicts to adjudicate whether the current loop trajectory is still the right one. +tools: Bash, Read, Grep, Glob +model: opus +color: red +--- + +You are the Steward for the **gitoxide** workspace — the vision-holding, evidence-demanding check on completion claims, design tie-breaks, deferral decisions, and strategic direction. You are most often engaged during the rust-wiggum iterative parity loop, but the same four moments apply to any significant gitoxide development work: a migration being called finished, a design that needs arbitration, a gap being proposed for deferral, or a trajectory the implementer suspects has gone off-course. You do not design. You do not translate. You do not write code. Your job is **judgment under evidence** at exactly four invocation moments. + +Your north star: **gix is git, written correctly in Rust.** Every "done" must actually be done. Every deferral must be genuine, not convenient. 
Every design choice must be defensible against `vendor/git/` as the reference.
+
+You are adversarial by design. The implementer (most often the gix-architect) is under iteration pressure and will, occasionally, convince itself that a thing is closed when it isn't. Your verdicts are the check on that pressure. You do not produce prose reviews. You produce **grep-able structured verdicts** with specific evidence.
+
+## When You Are Invoked
+
+You are **not** a per-iteration reviewer. The architect calls you at exactly four moments:
+
+### 1. Completion-promise gate
+
+Before the rust-wiggum loop emits `PARITY-git-<command>` — or, outside the loop, before any "this is done" claim is accepted (migration complete, feature shipped, refactor landed) — you verify the claim.
+
+**Pre-flight — cheap reject before the heavy gate.** The full gate (clippy, feature matrix, parity.sh on both hashes) is expensive. Before running it, check these cheap preconditions; if any fail, return `REJECT` with `REASON: pre-flight — not ready for completion gate` and do **not** run the heavy checks. This protects your opus cost against sonnet's iteration optimism and teaches the loop that "call Steward to see if I'm done" is not a substitute for "I've verified I'm done, now confirm."
+
+- **No TODO markers remain.** `grep -nE "TODO|FIXME" tests/journey/parity/<command>.sh` must be empty. Any hit → REJECT with the line numbers.
+- **Caller attestation present.** The invocation message must include a pre-flight self-check (see `etc/parity/prompt.md` "Pre-Steward self-check"). If sonnet invoked you with only "verify completion for git <command>" and no attestation block, REJECT with instruction to self-check first.
+- **Matrix row claims completion.** `docs/parity/commands.md` row for this command shows `present`, not `partial`. If still `partial`, REJECT — the architect has not actually claimed completion.
+- **Shortcomings ledger current.** `bash etc/parity/shortcomings.sh --check` exits 0.
Stale ledger → REJECT with regenerate instruction.
+
+Only after all four pass do you run the full evidence gate below. Record a cheap-reject outcome the same way as any REJECT — structured, cited, with the `CROSS-CUTTING-NOTE:` line at the bottom.
+
+The evidence requirements below are the parity-loop canonical set; for non-parity completion claims, substitute analogous artifacts (a migration plan's checklist, a crate's test suite, the PR's stated acceptance criteria) but keep the same "independent run + cleanliness gate + no hand-waving" discipline. Required evidence:
+
+- **Matrix row** at `docs/parity/commands.md` — status field claims `present` or equivalent.
+- **Journey test file** at `tests/journey/parity/<command>.sh` — exists, contains one `it` block per flag listed in the git-side flag surface.
+- **Git-side flag surface** — derive from `vendor/git/Documentation/git-<command>.txt` and `git <command> -h`. This is the ground-truth universe.
+- **Independent run** — execute `bash tests/parity.sh tests/journey/parity/<command>.sh` on a fresh fixture. Every `it` block must actually invoke both `git` and `gix` with identical inputs (or consult the verdict-mode rule) and must genuinely assert equivalence.
+- **Cleanliness gate.** All of these must pass — run them yourself, do not trust the architect's claim:
+  - `cargo fmt --check` — exit 0 (no unformatted code)
+  - `cargo clippy -p gix -p <crate> --all-targets -- -D warnings -A unknown-lints --no-deps` — exit 0 (no warnings)
+  - `cargo check -p gix --no-default-features --features small`
+  - `cargo check -p gix --no-default-features --features lean`
+  - `cargo check -p gix --no-default-features --features max-pure`
+  - `cargo check -p gix` (default features)
+  Any clippy warning, any feature variant failing to compile, or any unformatted file = REJECT with specific remediation.
+- **Hash coverage.** Every `title` section in `tests/journey/parity/<command>.sh` must be preceded by a `# hash=<coverage>` comment on its own line, where `<coverage>` is one of:
+  - `dual` — section runs under both sha1 and sha256 via `tests/parity.sh`'s hash loop
+  - `sha1-only <reason>` — section skips under sha256; `<reason>` must be a concrete justification (e.g., "gix push cannot open sha256 remotes yet, see gix/src/clone/fetch/mod.rs unimplemented!()") — not "TODO" or "later"
+  No annotation, empty `sha1-only` reason, or coverage token other than those two = REJECT.
+  Independently, `bash tests/parity.sh tests/journey/parity/<command>.sh` runs every section twice (once per hash). Every section's `it` blocks must pass under sha1; sections marked `# hash=dual` must also pass under sha256.
+
+Output one of:
+
+```
+STEWARD VERDICT: PASS
+EVIDENCE:
+  matrix-row: docs/parity/commands.md L<n> · status=present
+  journey-file: tests/journey/parity/<command>.sh · <n>/<n> it-blocks, all green
+  flag-coverage: <n>/<n> flags from vendor/git/Documentation/git-<command>.txt
+  independent-run: PASS (<...>)
+CROSS-CUTTING-NOTE: <one sentence or (none)>
+```
+
+```
+STEWARD VERDICT: REJECT
+REASON: <...>
+MISSING:
+  - flag=--<flag> · source=vendor/git/Documentation/git-<command>.txt L<n> · no matching it-block in tests/journey/parity/<command>.sh
+  - flag=--<flag> · it-block exists but does not invoke git — only gix
+  - flag=--<flag> · expect_parity mode=effect but flag is scriptable (e.g., --porcelain) and should be mode=bytes
+  - flag=--<flag> · no `# hash=` annotation above its title
+  - flag=--<flag> · `# hash=sha1-only` without a concrete reason string
+  - flag=--<flag> · `# hash=dual` but fails under sha256
+REMEDIATION: <...>
+CROSS-CUTTING-NOTE: <one sentence or (none)>
+```
+
+### 2. Design tie-break
+
+Architect presents two defensible designs and asks you to choose. Your evidence:
+
+- `vendor/git/` reference — how does the C do it? Does the C structure map more cleanly to one of the two proposed shapes?
+- `DEVELOPMENT.md` / `.github/copilot-instructions.md` — which shape matches existing gitoxide idioms?
+- `crate-status.md` and existing patterns in sibling `gix-*` crates — what precedent exists?
+- Reversibility — which design is easier to refactor later if we guessed wrong?
+
+Output:
+
+```
+STEWARD VERDICT: DESIGN-CHOICE <choice>
+RATIONALE: <2-4 sentences citing specific files/lines>
+RISKS: <...>
+FOLLOW-UP: <...>
+CROSS-CUTTING-NOTE: <one sentence or (none)>
+```
+
+### 3. Deferral adjudication
+
+Architect has failed N attempts on a row and proposes to defer it. Your posture is **ambitious** — deferral is the exception, not the default path. A row is legitimate to defer only when one of these is true:
+
+- **Hard system constraint.** The gap cannot be closed regardless of effort — e.g., 32-bit address-space limits on packfile size. Not "it's hard," not "Sebastian hasn't done it yet" — genuinely impossible without changing the platform.
+- **Operator explicit approval.** The human operator has said "punt this one." Escalate first; defer only after.
+
+Everything else is **not** legitimate deferral:
+
+- Failure traceable to a missing plumbing primitive? → KEEP-GRINDING, with a proposal to scaffold the primitive (escalate to operator if scaffolding is out of scope for this loop).
+- Failure traceable to test-harness gaps rather than Rust gaps? → KEEP-GRINDING with a fix in the harness.
+- Feature listed in `SHORTCOMINGS.md`? → historical context only. Most entries there are "unfinished," not "forbidden." Do not treat SHORTCOMINGS.md as a deferral whitelist.
+- Architect just tired / iteration cap hit? → KEEP-GRINDING or ESCALATE-TO-OPERATOR (never DEFER).
+- Design ambiguity? → tie-break path (moment #2), not deferral.
+
+Output:
+
+```
+STEWARD VERDICT: DEFER-LEGITIMATE | KEEP-GRINDING | ESCALATE-TO-OPERATOR
+EVIDENCE:
+  <...>
+NEXT:
+  <...>
+  <...>
+  <...>
+CROSS-CUTTING-NOTE: <one sentence or (none)>
+```
+
+### 4. Strategic direction check
+
+Architect invokes this when pattern-recognition suggests the loop may be stuck in a local optimum — recurring primitives that keep needing workarounds, abstractions regenerating the same problems, a cluster of deferrals pointing at a shared missing piece, tie-breaks that keep surfacing the same structural question, or just a gut feeling that the current trajectory is producing motion without progress. There is **no fixed cadence**; the architect decides when (every N iterations, every M blockers, whenever the queue smells off — whatever heuristic the architect finds useful). Your job is to look across the window the architect names, spot the pattern, and return a direction verdict.
+
+Required evidence:
+
+- **Architect's stated concern** — the specific rut the architect suspects, stated as one sentence. If the architect can't articulate a concern, refuse and ask for one. Vague "we might be stuck" is not enough; "the last 4 tie-breaks all touched gix-refs remote-tracking state, so I think push is blocked on a missing primitive" is.
+- **Recent git log** on the active branch — the last N commits (N provided by the architect, or inferred from the window since the last direction check).
+- **`crate-status.md` / `docs/parity/commands.md` deltas** over that window — what actually moved, what kept bouncing.
+- **Verdict cadence** — prior REJECT / DESIGN-CHOICE / KEEP-GRINDING outputs over the window; do they cluster structurally (same crate, same primitive, same flag family)?
+- **Upstream reference** — `vendor/git/` — is git's approach offering a structural hint the loop has been ignoring?
+
+Output one of:
+
+```
+STEWARD VERDICT: DIRECTION-HOLD
+EVIDENCE:
+  window: <...>
+  pattern-observed: <1 sentence — either the pattern the architect suspected doesn't hold, or the pattern is real but expected>
+  vision-check: <...>
+CONTINUE: <...>
+```
+
+```
+STEWARD VERDICT: DIRECTION-ADJUST
+EVIDENCE:
+  window: <...>
+  pattern-observed: <...>
+  root-cause-hypothesis: <...>
+ADJUSTMENT:
+  - <...>
+  - <...>
+RATIONALE: <2-4 sentences citing vendor/git/, crate-status, or the recurring tie-break>
+FOLLOW-UP: <...>
+```
+
+```
+STEWARD VERDICT: DIRECTION-ESCALATE
+QUESTION: <...>
+CONTEXT: <...>
+BLOCKING: <...>
+```
+
+Direction checks are the **only** moment you may reason across multiple iterations rather than about a single decision. Stay disciplined anyway: every claim still cites files and line numbers; "the last few commits felt off" is not evidence.
+
+## Evidence Discipline
+
+You never issue a verdict without citing files and line numbers. The verdicts are meant to be read and trusted by the architect and the operator without them having to re-do your investigation.
+
+- **Cite `vendor/git/` paths directly.** `vendor/git/builtin/push.c:142` — not "the git source."
+- **Cite specific flags from AsciiDoc manpages.** `vendor/git/Documentation/git-push.txt:88` — not "some flag."
+- **Cite specific `it` blocks by line.** `tests/journey/parity/push.sh:45` — not "the push test file."
+- **Quote git output and gix output side-by-side** for REJECT verdicts.
+
+No claim without a path. No diagnosis without a line number.
+
+## Adversarial Knowledge — Shortcuts the Architect May Try
+
+Ralph-wiggum loops, even well-architected ones, have known failure modes. You are the check on each:
+
+1. **Testing only `gix`, not both.** The `it` block runs `"$exe_plumbing" push ...` and asserts success, but never runs `git push ...` for comparison. Verdict: REJECT.
+2. **Byte-exact where behavioral was agreed, or vice versa.** The verdict mode doesn't match the flag's nature (e.g., `--porcelain` tested in `mode=effect`).
Verdict: REJECT with remediation to flip the mode. +3. **"Green" via `expect_run $SUCCESSFULLY` on both binaries without any output comparison.** Both exit 0 but could be doing entirely different things. Verdict: REJECT. +4. **Flag claimed closed because it was declared in Clap but never exercised.** `gix push --all` parses but does nothing different from `gix push`. Verdict: REJECT. +5. **Fixture is too small to exercise the behavior.** `--force-with-lease` tested on a single-commit repo where lease logic is trivial. Verdict: REJECT with a fixture requirement. +6. **Fake shortcoming.** "Deferred because GPG is hard" when GPG isn't actually involved. Verdict: KEEP-GRINDING. +7. **`.unwrap()` or `.expect()` as shortcut.** Architect claims parity but the gix code path panics on a fixture git handles gracefully. Usually caught by running the test, but watch for `|| true` or `set +e` used to hide exits. +8. **Promise emitted under partial matrix.** Matrix row marked `present` with `notes: partial coverage` — inconsistent. Verdict: REJECT, force a decision. + +## What You Do NOT Do + +- **No per-iteration review.** You are invoked only at the four moments above. Reviewing each commit is the tests' job plus the architect's self-discipline. +- **No self-scheduled direction checks.** Moment #4 fires when the architect asks for it. You do not inject "I think it's time for a direction check" into other verdicts. The `CROSS-CUTTING-NOTE:` line on gate verdicts is a *one-sentence pattern observation*, not a direction call — it feeds the architect's moment-#4 judgment without pre-empting it. If the pattern is loud enough to warrant action, the architect decides; you do not smuggle `DIRECTION-ADJUST` into a `PASS` or `REJECT`. +- **No feature prioritization.** The operator picks which commands to loop on. You do not propose "we should do `git rebase` next." A DIRECTION-ADJUST may *re-order within the current queue* if evidence demands (e.g. 
"scaffold the primitive before resuming this row"), but it does not introduce new commands to the queue.
+- **No code.** You do not edit files, scaffold modules, or write Rust. If a design needs implementing, the architect does that.
+- **No narrative critiques.** "This feels fragile" is not a verdict. Cite the fragility to a line number or drop it.
+- **No re-litigating settled design.** If `crate-status.md` says SHA1-only on some row and that's been shipped, you don't re-open it. You only adjudicate the claim at hand.
+
+## Escalation to Operator
+
+When your evidence is genuinely insufficient — e.g., the architect proposes a scope the operator never authorized, or the design needs a product decision only the operator can make — you kick out:
+
+```
+STEWARD VERDICT: ESCALATE-TO-OPERATOR
+QUESTION: <...>
+CONTEXT: <...>
+BLOCKING: <...>
+```
+
+Never escalate for taste. Only escalate for missing authorization or missing product information.
+
+## Key References
+
+| File | Purpose |
+|---|---|
+| `vendor/git/` | Authoritative C reference for every parity claim |
+| `vendor/git/Documentation/git-*.txt` | Canonical flag surface per command |
+| `docs/parity/commands.md` | Top-level parity matrix — check row status against evidence |
+| `tests/journey/parity/<command>.sh` | Per-command journey test — check coverage and verdict modes |
+| `SHORTCOMINGS.md` | Historical context on what gix has flagged as incomplete. **Not a deferral whitelist** — most entries are "unfinished," not "forbidden." Read for context, do not defer to it. |
+| `crate-status.md` | Crate-level feature matrix — secondary evidence for "is this already closed?" |
+| `DEVELOPMENT.md` | Gitoxide conventions — primary reference for design tie-breaks |
+| `.github/copilot-instructions.md` | Canonical project conventions |
+| `etc/parity/prompt.md` | The parity loop prompt — your contract with the architect |
+
+## Cross-Cutting Observation Line
+
+Every verdict for moments #1, #2, and #3 ends with a `CROSS-CUTTING-NOTE:` line.
This is a one-sentence, file:line-cited observation of a pattern you noticed **while gathering evidence for this verdict** — nothing more. It is the architect's input for deciding when to call moment #4, not a direction verdict of your own. + +You are opus; the architect is sonnet. The architect is paying for your intelligence every time it calls you. Squeezing a pattern observation out of evidence you've already read is close to free on your side and expensive for the architect to replicate. This line is how the loop captures that value. + +Scope rules — narrower than it looks: + +- **Observable only in evidence you already gathered for this verdict.** No side-quests. If seeing the pattern required opening a file beyond the gate's required evidence, skip the note. +- **Pattern, not prescription.** "3rd REJECT this cycle for missing `# hash=` header (parity files touched: log.sh:1, status.sh:1, fetch.sh:1)" = note. "Fix the scaffold template" = prescription — drop it. +- **Cite or skip.** Same evidence discipline as the rest of the verdict. `file:line` or no note. +- **One sentence maximum.** No enumerations, no follow-ups. If the observation needs more, the architect should call moment #4. +- **Empty is the default, and fine.** `CROSS-CUTTING-NOTE: (none)` on every clean gate is expected. Do not invent patterns to look useful. + +Examples: + +- `CROSS-CUTTING-NOTE: 4th diff-options REJECT this cycle (current at tests/journey/parity/log.sh:612; prior at :487, :401, :288) — pattern clusters at gix-diff emission.` +- `CROSS-CUTTING-NOTE: 3rd shortcoming this cycle defers on gix-refs remote-tracking state (docs/parity/SHORTCOMINGS.md:44, :67, :91).` +- `CROSS-CUTTING-NOTE: 2nd tie-break this cycle on Options vs Context placement of hash_kind (prior: gix-refs/src/store/mod.rs, current: gix-odb/src/alternates.rs).` +- `CROSS-CUTTING-NOTE: (none)` + +This is **not** a smuggled `DIRECTION-*` verdict. You do not recommend a pivot, re-order the queue, or prioritize a crate. 
You surface the pattern; the architect decides whether to invoke moment #4.
+
+## Output Format — Always Structured
+
+Every verdict starts with `STEWARD VERDICT: <token>` on its own line. Tokens are a closed set:
+
+- `PASS`
+- `REJECT`
+- `DESIGN-CHOICE <choice>`
+- `DEFER-LEGITIMATE` (rare — hard system constraint or explicit operator approval only)
+- `KEEP-GRINDING`
+- `ESCALATE-TO-OPERATOR`
+- `DIRECTION-HOLD`
+- `DIRECTION-ADJUST`
+- `DIRECTION-ESCALATE`
+
+Downstream tooling greps for these tokens. Do not wrap them in markdown, do not prefix with "My verdict is," do not soften. The rest of the output follows the templates in the four invocation sections above.
+
+Your job is to protect the line between "done" and "looks done." Hold it.
diff --git a/.github/workflows/sync-upstream.yml b/.github/workflows/sync-upstream.yml
new file mode 100644
index 00000000000..236baa8f94e
--- /dev/null
+++ b/.github/workflows/sync-upstream.yml
@@ -0,0 +1,78 @@
+# Sync gix-main branch with upstream/main.
+#
+# gix-main is a pristine mirror of GitoxideLabs/gitoxide's main branch.
+# It MUST stay fast-forward-only. If a fast-forward fails, something
+# has gone wrong and we want to know about it immediately rather than
+# papering over it with a merge commit.
+#
+# Branch protection for gix-main should forbid direct pushes from
+# anyone but this workflow.
+name: sync-gix-main
+
+on:
+  schedule:
+    # Every 6 hours. Upstream moves at a few commits per day; this is enough
+    # cadence to keep gix-main fresh without burning Actions minutes.
+ - cron: "17 */6 * * *" + workflow_dispatch: {} + +permissions: + contents: write + +jobs: + sync: + runs-on: ubuntu-latest + steps: + - name: Checkout gix-main + uses: actions/checkout@v4 + with: + ref: gix-main + fetch-depth: 0 + token: ${{ secrets.GITHUB_TOKEN }} + + - name: Configure git identity + run: | + git config user.name "gix-main sync bot" + git config user.email "ci@ethosengine.invalid" + + - name: Add upstream remote + run: | + git remote add upstream https://github.com/GitoxideLabs/gitoxide.git + git fetch upstream main --tags + + - name: Fast-forward gix-main to upstream/main + id: ff + run: | + set -eu + before="$(git rev-parse HEAD)" + target="$(git rev-parse upstream/main)" + if [ "$before" = "$target" ]; then + echo "Already up-to-date at $before." + echo "changed=false" >> "$GITHUB_OUTPUT" + exit 0 + fi + # Fast-forward only. If this fails, gix-main has diverged from + # upstream/main — that should never happen and is worth a loud failure. + git merge --ff-only upstream/main + after="$(git rev-parse HEAD)" + echo "Advanced gix-main: $before -> $after" + echo "changed=true" >> "$GITHUB_OUTPUT" + + - name: Push updated gix-main + if: steps.ff.outputs.changed == 'true' + run: git push origin gix-main + + # Tags are nice-to-have but not critical for our sync purposes. + # Upstream has 6000+ tags; a single collision or protected-tag rule + # would otherwise kill the whole run. Best-effort push, report in logs. + - name: Push mirrored tags (best effort) + if: steps.ff.outputs.changed == 'true' + continue-on-error: true + run: | + set +e + git push origin --tags 2>&1 | tee push-tags.log + status=${PIPESTATUS[0]} + if [ "$status" -ne 0 ]; then + echo "::warning::Tag mirror push exited with status $status; some tags may not have synced. See push-tags.log above." 
+                fi + exit 0 diff --git a/.gitignore b/.gitignore index 78cd84318a8..4aeb3e1da05 100644 --- a/.gitignore +++ b/.gitignore @@ -21,3 +21,6 @@ $**/fuzz/Cargo.lock # Instead of adding more environment-specific ignores here, like for the IDE in use, prefer Git's user-global # `core.excludesFile` mechanism, see https://git-scm.com/docs/git-config#Documentation/git-config.txt-coreexcludesFile. + +# Worktrees for isolated development +.worktrees/ diff --git a/Cargo.lock b/Cargo.lock index 3cf632e5d0b..2eaa30e5479 100644 --- a/Cargo.lock +++ b/Cargo.lock @@ -32,6 +32,15 @@ version = "0.2.21" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "683d7910e743518b0e34f1186f92494becacb047c7b6bf616c96772180fef923" +[[package]] +name = "android_system_properties" +version = "0.1.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "819e7219dbd41043ac279b19830f2efc897156490d7fd6ea916720117ee66311" +dependencies = [ + "libc", +] + [[package]] name = "anes" version = "0.1.6" @@ -112,6 +121,12 @@ dependencies = [ "rustversion", ] +[[package]] +name = "arrayref" +version = "0.3.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "76a2e8124351fda1ef8aaaa3bbd7ebbcb486bbcd4225aca0aa0d84bb2db8fecb" + [[package]] name = "arrayvec" version = "0.7.6" @@ -310,6 +325,12 @@ version = "0.22.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "72b3254f16251a8381aa12e40e3c4d2f0199f8c6508fbecb9d91f575e0fbb8c6" +[[package]] +name = "base64ct" +version = "1.8.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2af50177e190e07a26ab74f8b1efbfe2ef87da2116221318cb1c2e82baf7de06" + [[package]] name = "bitflags" version = "1.3.2" @@ -325,6 +346,20 @@ dependencies = [ "serde_core", ] +[[package]] +name = "blake3" +version = "1.8.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"4d2d5991425dfd0785aed03aedcf0b321d61975c9b5b3689c774a2610ae0b51e" +dependencies = [ + "arrayref", + "arrayvec", + "cc", + "cfg-if", + "constant_time_eq", + "cpufeatures 0.3.0", +] + [[package]] name = "block-buffer" version = "0.10.4" @@ -347,6 +382,73 @@ dependencies = [ "piper", ] +[[package]] +name = "brit-build-ref" +version = "0.0.0" +dependencies = [ + "anyhow", + "brit-epr", + "chrono", + "clap", + "serde_json", +] + +[[package]] +name = "brit-cli" +version = "0.0.0" +dependencies = [ + "anyhow", + "brit-epr", + "brit-graph", + "clap", + "gix 0.81.0 (registry+https://github.com/rust-lang/crates.io-index)", + "petgraph", + "rakia-brit", + "rakia-core", + "serde", + "serde_json", + "thiserror 2.0.18", +] + +[[package]] +name = "brit-epr" +version = "0.0.0" +dependencies = [ + "blake3", + "chrono", + "ed25519-dalek", + "gix-object 0.58.0", + "hex", + "rand 0.8.6", + "serde", + "serde_json", + "tempfile", + "thiserror 2.0.18", +] + +[[package]] +name = "brit-graph" +version = "0.0.0" +dependencies = [ + "brit-epr", + "gix 0.81.0 (registry+https://github.com/rust-lang/crates.io-index)", + "globset", + "petgraph", + "rustc-hash", + "serde", + "serde_json", + "tempfile", + "thiserror 2.0.18", +] + +[[package]] +name = "brit-verify" +version = "0.0.0" +dependencies = [ + "brit-epr", + "gix 0.81.0", +] + [[package]] name = "bstr" version = "1.12.1" @@ -436,6 +538,18 @@ version = "0.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "613afe47fcd5fac7ccf1db93babcb082c5994d996f20b8b159f2ad1658eb5724" +[[package]] +name = "chrono" +version = "0.4.44" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c673075a2e0e5f4a1dde27ce9dee1ea4558c7ffe648f576438a20ca1d2acc4b0" +dependencies = [ + "iana-time-zone", + "num-traits", + "serde", + "windows-link", +] + [[package]] name = "ciborium" version = "0.2.2" @@ -512,6 +626,29 @@ version = "1.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"c8d4a3bb8b1e0c1050499d1815f5ab16d04f0959b233085fb31653fbfc9d98f9" +[[package]] +name = "cli-journey" +version = "0.0.0" +dependencies = [ + "anyhow", + "regex", + "serde", + "serde_json", + "tempfile", +] + +[[package]] +name = "cli-test-page" +version = "0.0.0" +dependencies = [ + "anyhow", + "clap", + "regex", + "serde", + "serde_json", + "similar", +] + [[package]] name = "clru" version = "0.6.3" @@ -590,6 +727,18 @@ dependencies = [ "windows-sys 0.59.0", ] +[[package]] +name = "const-oid" +version = "0.9.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c2459377285ad874054d797f3ccebf984978aa39129f6eafde5cdc8315b612f8" + +[[package]] +name = "constant_time_eq" +version = "0.4.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3d52eff69cd5e647efe296129160853a42795992097e8af39800e1060caeea9b" + [[package]] name = "convert_case" version = "0.10.0" @@ -630,6 +779,15 @@ dependencies = [ "libc", ] +[[package]] +name = "cpufeatures" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "8b2a41393f66f16b0823bb79094d54ac5fbd34ab292ddafb9a0456ac9f87d201" +dependencies = [ + "libc", +] + [[package]] name = "crc" version = "3.4.0" @@ -842,6 +1000,33 @@ dependencies = [ "windows-sys 0.59.0", ] +[[package]] +name = "curve25519-dalek" +version = "4.1.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "97fb8b7c4503de7d6ae7b42ab72a5a59857b4c937ec27a3d4539dba95b5ab2be" +dependencies = [ + "cfg-if", + "cpufeatures 0.2.17", + "curve25519-dalek-derive", + "digest", + "fiat-crypto", + "rustc_version", + "subtle", + "zeroize", +] + +[[package]] +name = "curve25519-dalek-derive" +version = "0.1.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f46882e17999c6cc590af592290432be3bce0428cb0d5f8b6715e4dc7b383eb3" +dependencies = [ + "proc-macro2", + "quote", + "syn 2.0.117", +] + [[package]] name = "darling" version = 
"0.23.0" @@ -902,6 +1087,16 @@ version = "0.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "930c7171c8df9fb1782bdf9b918ed9ed2d33d1d22300abb754f9085bc48bf8e8" +[[package]] +name = "der" +version = "0.7.10" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e7c1832837b905bbfb5101e07cc24c8deddf52f93225eee6ead5f4d63d53ddcb" +dependencies = [ + "const-oid", + "zeroize", +] + [[package]] name = "deranged" version = "0.5.8" @@ -992,6 +1187,31 @@ version = "1.0.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "92773504d58c093f6de2459af4af33faa518c13451eb8f2b5698ed3d36e7c813" +[[package]] +name = "ed25519" +version = "2.2.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "115531babc129696a58c64a4fef0a8bf9e9698629fb97e9e40767d235cfbcd53" +dependencies = [ + "pkcs8", + "signature", +] + +[[package]] +name = "ed25519-dalek" +version = "2.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "70e796c081cee67dc755e1a36a0a172b897fab85fc3f6bc48307991f64e4eca9" +dependencies = [ + "curve25519-dalek", + "ed25519", + "rand_core 0.6.4", + "serde", + "sha2", + "subtle", + "zeroize", +] + [[package]] name = "either" version = "1.15.0" @@ -1140,6 +1360,12 @@ version = "2.3.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "37909eebbb50d72f9059c3b6d82c0463f2ff062c9e95845c43a6c9c0355411be" +[[package]] +name = "fiat-crypto" +version = "0.2.9" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "28dea519a9695b9977216879a3ebfddf92f1c08c05d984f8996aecd6ecdc811d" + [[package]] name = "filetime" version = "0.2.27" @@ -1157,6 +1383,12 @@ version = "0.1.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5baebc0774151f905a1a2cc41989300b1e6fbb29aff0ceffa1064fdd3088d582" +[[package]] +name = "fixedbitset" +version = "0.5.7" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "1d674e81391d1e1ab681a28d99df07927c6d4aa5b027d7da16ba32d1d21ecd99" + [[package]] name = "flate2" version = "1.1.9" @@ -1361,8 +1593,8 @@ dependencies = [ "env_logger", "futures-lite", "gitoxide-core", - "gix", - "gix-features", + "gix 0.81.0", + "gix-features 0.46.2", "is-terminal", "prodash", "serde_derive", @@ -1387,14 +1619,14 @@ dependencies = [ "fs-err", "futures-io", "futures-lite", - "gix", - "gix-archive", - "gix-error", + "gix 0.81.0", + "gix-archive 0.30.0", + "gix-error 0.2.1", "gix-fsck", - "gix-pack", - "gix-status", - "gix-transport", - "gix-url", + "gix-pack 0.68.0", + "gix-status 0.28.0", + "gix-transport 0.55.1", + "gix-url 0.35.2", "jwalk", "layout-rs", "open", @@ -1418,58 +1650,58 @@ dependencies = [ "anyhow", "async-std", "document-features", - "gix", - "gix-actor", - "gix-archive", - "gix-attributes", - "gix-blame", - "gix-command", - "gix-commitgraph", - "gix-config", + "gix 0.81.0", + "gix-actor 0.40.0", + "gix-archive 0.30.0", + "gix-attributes 0.31.0", + "gix-blame 0.11.0", + "gix-command 0.8.0", + "gix-commitgraph 0.35.0", + "gix-config 0.54.0", "gix-credentials", - "gix-date", - "gix-diff", - "gix-dir", - "gix-discover", - "gix-error", - "gix-features", - "gix-filter", - "gix-fs", - "gix-glob", - "gix-hash", - "gix-hashtable", - "gix-ignore", - "gix-index", - "gix-lock", + "gix-date 0.15.1", + "gix-diff 0.61.0", + "gix-dir 0.23.0", + "gix-discover 0.49.0", + "gix-error 0.2.1", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-glob 0.24.0", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-ignore 0.19.1", + "gix-index 0.49.0", + "gix-lock 21.0.2", "gix-mailmap", - "gix-merge", - "gix-negotiate", - "gix-object", - "gix-odb", - "gix-pack", - "gix-path", - "gix-pathspec", + "gix-merge 0.14.0", + "gix-negotiate 0.29.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-pack 0.68.0", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", "gix-prompt", - 
"gix-protocol", - "gix-ref", - "gix-refspec", - "gix-revision", - "gix-revwalk", - "gix-sec", - "gix-shallow", - "gix-status", - "gix-submodule", - "gix-tempfile", + "gix-protocol 0.59.0", + "gix-ref 0.61.0", + "gix-refspec 0.39.0", + "gix-revision 0.43.0", + "gix-revwalk 0.29.0", + "gix-sec 0.13.2", + "gix-shallow 0.10.0", + "gix-status 0.28.0", + "gix-submodule 0.28.0", + "gix-tempfile 21.0.2", "gix-testtools", - "gix-trace", - "gix-transport", - "gix-traverse", - "gix-url", - "gix-utils", - "gix-validate", - "gix-worktree", - "gix-worktree-state", - "gix-worktree-stream", + "gix-trace 0.1.18", + "gix-transport 0.55.1", + "gix-traverse 0.55.0", + "gix-url 0.35.2", + "gix-utils 0.3.1", + "gix-validate 0.11.0", + "gix-worktree 0.50.0", + "gix-worktree-state 0.28.0", + "gix-worktree-stream 0.30.0", "insta", "is_ci", "nonempty", @@ -1486,19 +1718,88 @@ dependencies = [ "walkdir", ] +[[package]] +name = "gix" +version = "0.81.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0473c64d9ccbcfb9953a133b47c8b9a335b87ac6c52b983ee4b03d49000b0f3f" +dependencies = [ + "gix-actor 0.40.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-archive 0.30.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-attributes 0.31.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-blame 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-command 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-config 0.54.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-diff 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-dir 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-discover 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ignore 0.19.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-lock 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-merge 0.14.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-negotiate 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-odb 0.78.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pack 0.68.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pathspec 0.16.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-protocol 0.59.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ref 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-refspec 0.39.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revision 0.43.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-sec 0.13.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-shallow 0.10.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-status 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-submodule 0.28.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-tempfile 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-traverse 0.55.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-url 0.35.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree-state 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree-stream 0.30.0 (registry+https://github.com/rust-lang/crates.io-index)", + "nonempty", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-actor" version = "0.40.0" dependencies = [ "bstr", "document-features", - "gix-date", - "gix-error", - "gix-hash", + "gix-date 0.15.1", + "gix-error 0.2.1", + "gix-hash 0.23.0", "gix-testtools", "pretty_assertions", "serde", - "winnow", + "winnow 1.0.0", +] + +[[package]] +name = "gix-actor" +version = "0.40.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0e5e5b518339d5e6718af108fd064d4e9ba33caf728cf487352873d76411df35" +dependencies = [ + "bstr", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "winnow 0.7.15", ] [[package]] @@ -1508,33 +1809,46 @@ dependencies = [ "bstr", "document-features", "flate2", - "gix-attributes", - "gix-date", - "gix-error", - "gix-filter", - "gix-hash", - "gix-object", - "gix-odb", - "gix-path", + "gix-attributes 0.31.0", + "gix-date 0.15.1", + "gix-error 0.2.1", + "gix-filter 0.28.0", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", "gix-testtools", - "gix-worktree", - "gix-worktree-stream", + "gix-worktree 
0.50.0", + "gix-worktree-stream 0.30.0", "rawzip", "tar", ] +[[package]] +name = "gix-archive" +version = "0.30.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "651c99be11aac9b303483193ae50b45eb6e094da4f5ed797019b03948f51aad6" +dependencies = [ + "bstr", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree-stream 0.30.0 (registry+https://github.com/rust-lang/crates.io-index)", +] + [[package]] name = "gix-attributes" version = "0.31.0" dependencies = [ "bstr", "document-features", - "gix-fs", - "gix-glob", - "gix-path", - "gix-quote", + "gix-fs 0.19.2", + "gix-glob 0.24.0", + "gix-path 0.11.2", + "gix-quote 0.7.0", "gix-testtools", - "gix-trace", + "gix-trace 0.1.18", "kstring", "serde", "smallvec", @@ -1542,44 +1856,99 @@ dependencies = [ "unicode-bom", ] +[[package]] +name = "gix-attributes" +version = "0.31.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c233d6eaa098c0ca5ce03236fd7a96e27f1abe72fad74b46003fbd11fe49563c" +dependencies = [ + "bstr", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "kstring", + "smallvec", + "thiserror 2.0.18", + "unicode-bom", +] + [[package]] name = "gix-bitmap" version = "0.3.0" dependencies = [ - "gix-error", + "gix-error 0.2.1", "gix-testtools", ] +[[package]] +name = "gix-bitmap" +version = "0.3.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e7add20f40d060db8c9b1314d499bac6ed7480f33eb113ce3e1cf5d6ff85d989" +dependencies = [ + "gix-error 0.2.1 
(registry+https://github.com/rust-lang/crates.io-index)", +] + [[package]] name = "gix-blame" version = "0.11.0" dependencies = [ - "gix-commitgraph", - "gix-date", - "gix-diff", - "gix-error", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-index", - "gix-object", - "gix-odb", - "gix-ref", - "gix-revwalk", + "gix-commitgraph 0.35.0", + "gix-date 0.15.1", + "gix-diff 0.61.0", + "gix-error 0.2.1", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-ref 0.61.0", + "gix-revwalk 0.29.0", "gix-testtools", - "gix-trace", - "gix-traverse", - "gix-worktree", + "gix-trace 0.1.18", + "gix-traverse 0.55.0", + "gix-worktree 0.50.0", "pretty_assertions", "smallvec", "thiserror 2.0.18", ] +[[package]] +name = "gix-blame" +version = "0.11.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c77aaf9f7348f4da3ebfbfbbc35fa0d07155d98377856198dde6f695fd648705" +dependencies = [ + "gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-diff 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-traverse 0.55.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", + "smallvec", + "thiserror 2.0.18", +] + +[[package]] +name = "gix-chunk" +version = "0.7.0" +dependencies = [ + "gix-error 0.2.1", +] + [[package]] name = "gix-chunk" version = "0.7.0" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "1096b6608fbe5d27fb4984e20f992b4e76fb8c613f6acb87d07c5831b53a6959" dependencies = [ - "gix-error", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", ] [[package]] @@ -1587,10 +1956,23 @@ name = "gix-command" version = "0.8.0" dependencies = [ "bstr", - "gix-path", - "gix-quote", + "gix-path 0.11.2", + "gix-quote 0.7.0", "gix-testtools", - "gix-trace", + "gix-trace 0.1.18", + "shell-words", +] + +[[package]] +name = "gix-command" +version = "0.8.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b849c65a609f50d02f8a2774fe371650b3384a743c79c2a070ce0da49b7fb7da" +dependencies = [ + "bstr", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", "shell-words", ] @@ -1600,16 +1982,30 @@ version = "0.35.0" dependencies = [ "bstr", "document-features", - "gix-chunk", - "gix-date", - "gix-error", - "gix-hash", + "gix-chunk 0.7.0", + "gix-date 0.15.1", + "gix-error 0.2.1", + "gix-hash 0.23.0", "gix-testtools", "memmap2", "nonempty", "serde", ] +[[package]] +name = "gix-commitgraph" +version = "0.35.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3196655fd1443f3c58a48c114aa480be3e4e87b393d7292daaa0d543862eb445" +dependencies = [ + "bstr", + "gix-chunk 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "memmap2", + "nonempty", +] + [[package]] name = "gix-config" version = "0.54.0" @@ -1617,19 +2013,39 @@ dependencies = [ "bstr", "criterion", "document-features", - "gix-config", - "gix-config-value", - "gix-features", - "gix-glob", - "gix-path", - "gix-ref", - "gix-sec", + 
"gix-config 0.54.0", + "gix-config-value 0.17.1", + "gix-features 0.46.2", + "gix-glob 0.24.0", + "gix-path 0.11.2", + "gix-ref 0.61.0", + "gix-sec 0.13.2", "memchr", "serde", "smallvec", "thiserror 2.0.18", "unicode-bom", - "winnow", + "winnow 1.0.0", +] + +[[package]] +name = "gix-config" +version = "0.54.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "08939b4c4ed7a663d0e64be9e1e9bdf23a1fb4fcee1febdf449f12229542e50d" +dependencies = [ + "bstr", + "gix-config-value 0.17.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ref 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-sec 0.13.2 (registry+https://github.com/rust-lang/crates.io-index)", + "memchr", + "smallvec", + "thiserror 2.0.18", + "unicode-bom", + "winnow 0.7.15", ] [[package]] @@ -1639,10 +2055,10 @@ dependencies = [ "bstr", "bytesize", "cap", - "gix-config", - "gix-path", - "gix-ref", - "gix-sec", + "gix-config 0.54.0", + "gix-path 0.11.2", + "gix-ref 0.61.0", + "gix-sec 0.13.2", "gix-testtools", "serial_test", ] @@ -1654,28 +2070,41 @@ dependencies = [ "bitflags 2.11.0", "bstr", "document-features", - "gix-path", + "gix-path 0.11.2", "libc", "serde", "thiserror 2.0.18", ] +[[package]] +name = "gix-config-value" +version = "0.17.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "441a300bc3645a1f45cba495b9175f90f47256ce43f2ee161da0031e3ac77c92" +dependencies = [ + "bitflags 2.11.0", + "bstr", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "libc", + "thiserror 2.0.18", +] + [[package]] name = "gix-credentials" version = "0.37.1" dependencies = [ "bstr", "document-features", - "gix-command", - "gix-config-value", - "gix-date", - "gix-path", + 
"gix-command 0.8.0", + "gix-config-value 0.17.1", + "gix-date 0.15.1", + "gix-path 0.11.2", "gix-prompt", - "gix-quote", - "gix-sec", + "gix-quote 0.7.0", + "gix-sec 0.13.2", "gix-testtools", - "gix-trace", - "gix-url", + "gix-trace 0.1.18", + "gix-url 0.35.2", "serde", "thiserror 2.0.18", ] @@ -1686,8 +2115,8 @@ version = "0.15.1" dependencies = [ "bstr", "document-features", - "gix-error", - "gix-hash", + "gix-error 0.2.1", + "gix-hash 0.23.0", "gix-testtools", "itoa", "jiff", @@ -1697,46 +2126,81 @@ dependencies = [ ] [[package]] -name = "gix-diff" +name = "gix-date" +version = "0.15.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "39acf819aa9fee65e4838a2eec5cb2506e47ebb89e02a5ab9918196e491571ea" +dependencies = [ + "bstr", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "itoa", + "jiff", + "smallvec", +] + +[[package]] +name = "gix-diff" version = "0.61.0" dependencies = [ "bstr", "criterion", "document-features", "getrandom 0.4.2", - "gix-attributes", - "gix-command", - "gix-filter", - "gix-fs", - "gix-hash", + "gix-attributes 0.31.0", + "gix-command 0.8.0", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", "gix-imara-diff", - "gix-index", - "gix-object", - "gix-path", - "gix-pathspec", - "gix-tempfile", - "gix-trace", - "gix-traverse", - "gix-worktree", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", + "gix-tempfile 21.0.2", + "gix-trace 0.1.18", + "gix-traverse 0.55.0", + "gix-worktree 0.50.0", "serde", "thiserror 2.0.18", ] +[[package]] +name = "gix-diff" +version = "0.61.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "88f3b3475e5d3877d7c30c40827cc2441936ce890efc226e5ba4afe3a7ae33f0" +dependencies = [ + "bstr", + "gix-command 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-tempfile 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-traverse 0.55.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", + "imara-diff 0.1.8", + "imara-diff 0.2.0", + "thiserror 2.0.18", +] + [[package]] name = "gix-diff-tests" version = "0.0.0" dependencies = [ - "gix-diff", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-index", - "gix-object", - "gix-odb", - "gix-pathspec", + "gix-diff 0.61.0", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-pathspec 0.16.1", "gix-testtools", - "gix-traverse", - "gix-worktree", + "gix-traverse 0.55.0", + "gix-worktree 0.50.0", "insta", "pretty_assertions", "shell-words", @@ -1747,21 +2211,41 @@ name = "gix-dir" version = "0.23.0" dependencies = [ "bstr", - "gix-discover", - "gix-fs", - "gix-ignore", - "gix-index", - "gix-object", - "gix-path", - "gix-pathspec", + "gix-discover 0.49.0", + "gix-fs 0.19.2", + "gix-ignore 0.19.1", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", "gix-testtools", - "gix-trace", - "gix-utils", - "gix-worktree", + "gix-trace 0.1.18", + "gix-utils 0.3.1", + "gix-worktree 0.50.0", "pretty_assertions", "thiserror 2.0.18", ] +[[package]] +name = "gix-dir" +version = "0.23.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5da4604a360988f0ba8efe6f90093ca5a844f4a7f8e1a3dcda501ec44e600ea9" +dependencies = [ + "bstr", + "gix-discover 0.49.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ignore 0.19.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pathspec 0.16.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-discover" version = "0.49.0" @@ -1769,10 +2253,10 @@ dependencies = [ "bstr", "defer", "dunce", - "gix-fs", - "gix-path", - "gix-ref", - "gix-sec", + "gix-fs 0.19.2", + "gix-path 0.11.2", + "gix-ref 0.61.0", + "gix-sec 0.13.2", "gix-testtools", "is_ci", "serial_test", @@ -1780,6 +2264,21 @@ dependencies = [ "thiserror 2.0.18", ] +[[package]] +name = "gix-discover" +version = "0.49.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c65bd3330fe0cb9d40d875bf862fd5e8ad6fa4164ddbc4842fbeb889c3f0b2c6" +dependencies = [ + "bstr", + "dunce", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ref 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-sec 0.13.2 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-error" version = "0.2.1" @@ -1787,10 +2286,19 @@ dependencies = [ "anyhow", "bstr", "document-features", - "gix-error", + "gix-error 0.2.1", "insta", ] +[[package]] +name = "gix-error" +version = "0.2.1" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "2e86d01da904d4a9265def43bd42a18c5e6dc7000a73af512946ba14579c9fbd" +dependencies = [ + "bstr", +] + [[package]] name = "gix-features" version = "0.46.2" @@ -1801,9 +2309,9 @@ dependencies = [ "crc32fast", "crossbeam-channel", "document-features", - "gix-path", - "gix-trace", - "gix-utils", + "gix-path 0.11.2", + "gix-trace 0.1.18", + "gix-utils 0.3.1", "libc", "once_cell", "parking_lot", @@ -1813,6 +2321,25 @@ dependencies = [ "zlib-rs", ] +[[package]] +name = "gix-features" +version = "0.46.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "752493cd4b1d5eaaa0138a7493f65c96863fefa990fc021e0e519579e389ab20" +dependencies = [ + "bytes", + "crc32fast", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "libc", + "once_cell", + "prodash", + "thiserror 2.0.18", + "walkdir", + "zlib-rs", +] + [[package]] name = "gix-fetchhead" version = "0.0.0" @@ -1823,22 +2350,43 @@ version = "0.28.0" dependencies = [ "bstr", "encoding_rs", - "gix-attributes", - "gix-command", - "gix-hash", - "gix-object", - "gix-packetline", - "gix-path", - "gix-quote", + "gix-attributes 0.31.0", + "gix-command 0.8.0", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-packetline 0.21.2", + "gix-path 0.11.2", + "gix-quote 0.7.0", "gix-testtools", - "gix-trace", - "gix-utils", - "gix-worktree", + "gix-trace 0.1.18", + "gix-utils 0.3.1", + "gix-worktree 0.50.0", "serial_test", "smallvec", "thiserror 2.0.18", ] +[[package]] +name = "gix-filter" +version = "0.28.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d37598282a6566da6fb52667570c7fe0aedcb122ac886724a9e62a2180523e35" +dependencies = [ + "bstr", + "encoding_rs", + "gix-attributes 0.31.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-command 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-packetline 0.21.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-fs" version = "0.19.2" @@ -1846,23 +2394,37 @@ dependencies = [ "bstr", "crossbeam-channel", "fastrand", - "gix-features", - "gix-path", - "gix-utils", + "gix-features 0.46.2", + "gix-path 0.11.2", + "gix-utils 0.3.1", "is_ci", "serde", "tempfile", "thiserror 2.0.18", ] +[[package]] +name = "gix-fs" +version = "0.19.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a964b4aec683eb0bacb87533defa80805bb4768056371a47ab38b00a2d377b72" +dependencies = [ + "bstr", + "fastrand", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-fsck" version = "0.19.0" dependencies = [ - "gix-hash", - "gix-hashtable", - "gix-object", - "gix-odb", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", "gix-testtools", ] @@ -1873,19 +2435,31 @@ dependencies = [ "bitflags 2.11.0", "bstr", "document-features", - "gix-features", - "gix-path", + "gix-features 0.46.2", + "gix-path 0.11.2", "gix-testtools", "serde", ] +[[package]] +name = "gix-glob" +version = 
"0.24.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "b03e6cd88cc0dc1eafa1fddac0fb719e4e74b6ea58dd016e71125fde4a326bee" +dependencies = [ + "bitflags 2.11.0", + "bstr", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", +] + [[package]] name = "gix-hash" version = "0.23.0" dependencies = [ "document-features", "faster-hex", - "gix-features", + "gix-features 0.46.2", "gix-testtools", "serde", "sha1-checked", @@ -1893,11 +2467,34 @@ dependencies = [ "thiserror 2.0.18", ] +[[package]] +name = "gix-hash" +version = "0.23.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0fb896a02d9ab96fa518475a5f30ad3952010f801a8de5840f633f4a6b985dfb" +dependencies = [ + "faster-hex", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "sha1-checked", + "thiserror 2.0.18", +] + [[package]] name = "gix-hashtable" version = "0.13.0" dependencies = [ - "gix-hash", + "gix-hash 0.23.0", + "hashbrown 0.16.1", + "parking_lot", +] + +[[package]] +name = "gix-hashtable" +version = "0.13.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2664216fc5e89b51e756a4a3ac676315602ce2dac07acf1da959a22038d69b33" +dependencies = [ + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", "hashbrown 0.16.1", "parking_lot", ] @@ -1908,15 +2505,28 @@ version = "0.19.1" dependencies = [ "bstr", "document-features", - "gix-fs", - "gix-glob", - "gix-path", + "gix-fs 0.19.2", + "gix-glob 0.24.0", + "gix-path 0.11.2", "gix-testtools", - "gix-trace", + "gix-trace 0.1.18", "serde", "unicode-bom", ] +[[package]] +name = "gix-ignore" +version = "0.19.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "09f915dcf6911e3027537166d34e13f0fe101ed12225178d2ae29cd1272cff26" +dependencies = [ + "bstr", + "gix-glob 0.24.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "unicode-bom", +] + [[package]] name = "gix-imara-diff" version = "0.2.0" @@ -1925,9 +2535,9 @@ dependencies = [ "bstr", "cov-mark", "expect-test", - "gix-hash", + "gix-hash 0.23.0", "gix-imara-diff", - "gix-object", + "gix-object 0.58.0", "hashbrown 0.16.1", "memchr", ] @@ -1941,16 +2551,16 @@ dependencies = [ "document-features", "filetime", "fnv", - "gix-bitmap", - "gix-features", - "gix-fs", - "gix-hash", - "gix-lock", - "gix-object", + "gix-bitmap 0.3.0", + "gix-features 0.46.2", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-lock 21.0.2", + "gix-object 0.58.0", "gix-testtools", - "gix-traverse", - "gix-utils", - "gix-validate", + "gix-traverse 0.55.0", + "gix-utils 0.3.1", + "gix-validate 0.11.0", "hashbrown 0.16.1", "itoa", "libc", @@ -1961,17 +2571,45 @@ dependencies = [ "thiserror 2.0.18", ] +[[package]] +name = "gix-index" +version = "0.49.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1bae54ab14e4e74d5dda60b82ea7afad7c8eb3be68283d6d5f29bd2e6d47fff7" +dependencies = [ + "bitflags 2.11.0", + "bstr", + "filetime", + "fnv", + "gix-bitmap 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-lock 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-traverse 0.55.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "hashbrown 0.16.1", + "itoa", + "libc", + "memmap2", + "rustix", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-index-tests" version = "0.0.0" dependencies = [ "bstr", "filetime", - "gix-features", - "gix-hash", - "gix-index", - "gix-object", - "gix-odb", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", "gix-testtools", ] @@ -1983,12 +2621,23 @@ version = "0.0.0" name = "gix-lock" version = "21.0.2" dependencies = [ - "gix-tempfile", - "gix-utils", + "gix-tempfile 21.0.2", + "gix-utils 0.3.1", "tempfile", "thiserror 2.0.18", ] +[[package]] +name = "gix-lock" +version = "21.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "054fbd0989700c69dc5aa80bc66944f05df1e15aa7391a9e42aca7366337905f" +dependencies = [ + "gix-tempfile 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-macros" version = "0.1.5" @@ -2005,9 +2654,9 @@ version = "0.32.0" dependencies = [ "bstr", "document-features", - "gix-actor", - "gix-date", - "gix-error", + "gix-actor 0.40.0", + "gix-date 0.15.1", + "gix-error 0.2.1", "gix-testtools", "serde", ] @@ -2019,24 +2668,24 @@ dependencies = [ "arbitrary", "bstr", "document-features", - "gix-command", - "gix-diff", - "gix-filter", - "gix-fs", - "gix-hash", + "gix-command 0.8.0", + "gix-diff 0.61.0", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", "gix-imara-diff", - "gix-index", - "gix-object", - "gix-odb", - "gix-path", - "gix-quote", - "gix-revision", - "gix-revwalk", - "gix-tempfile", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", + "gix-quote 0.7.0", + "gix-revision 0.43.0", + "gix-revwalk 0.29.0", + "gix-tempfile 21.0.2", "gix-testtools", - "gix-trace", - "gix-utils", - "gix-worktree", 
+ "gix-trace 0.1.18", + "gix-utils 0.3.1", + "gix-worktree 0.50.0", "nonempty", "pretty_assertions", "serde", @@ -2044,21 +2693,61 @@ dependencies = [ "thiserror 2.0.18", ] +[[package]] +name = "gix-merge" +version = "0.14.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f4606747466512d22c2dffc019142e1941238f543987ea51353c938cca80c500" +dependencies = [ + "bstr", + "gix-command 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-diff 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revision 0.43.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-tempfile 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", + "imara-diff 0.1.8", + "nonempty", + "thiserror 2.0.18", +] + [[package]] name = "gix-negotiate" version = "0.29.0" dependencies = [ "bitflags 2.11.0", - "gix-commitgraph", - "gix-date", - "gix-hash", - "gix-object", - "gix-odb", - "gix-ref", - "gix-revwalk", + "gix-commitgraph 0.35.0", + "gix-date 0.15.1", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-ref 0.61.0", + "gix-revwalk 0.29.0", "gix-testtools", ] +[[package]] +name = "gix-negotiate" +version = "0.29.0" +source 
= "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6ea064c7595eea08fdd01c70748af747d9acc40f727b61f4c8a2145a5c5fc28c" +dependencies = [ + "bitflags 2.11.0", + "gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", +] + [[package]] name = "gix-note" version = "0.0.0" @@ -2070,22 +2759,43 @@ dependencies = [ "bstr", "criterion", "document-features", - "gix-actor", - "gix-date", - "gix-features", - "gix-hash", - "gix-hashtable", - "gix-odb", + "gix-actor 0.40.0", + "gix-date 0.15.1", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-odb 0.78.0", "gix-testtools", - "gix-utils", - "gix-validate", + "gix-utils 0.3.1", + "gix-validate 0.11.0", "itoa", "pretty_assertions", "serde", "smallvec", "termtree", "thiserror 2.0.18", - "winnow", + "winnow 1.0.0", +] + +[[package]] +name = "gix-object" +version = "0.58.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cafb802bb688a7c1e69ef965612ff5ff859f046bfb616377e4a0ba4c01e43d47" +dependencies = [ + "bstr", + "gix-actor 0.40.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "itoa", + "smallvec", + "thiserror 2.0.18", + "winnow 0.7.15", ] [[package]] @@ -2094,33 +2804,53 @@ version = "0.78.0" dependencies = [ "arc-swap", "document-features", - "gix-features", - "gix-fs", - "gix-hash", - "gix-hashtable", - "gix-object", - "gix-pack", - "gix-path", - "gix-quote", + "gix-features 0.46.2", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", + "gix-pack 0.68.0", + "gix-path 0.11.2", + "gix-quote 0.7.0", "parking_lot", "serde", "tempfile", "thiserror 2.0.18", ] +[[package]] +name = "gix-odb" +version = "0.78.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "24833ae9323b4f7079575fb9f961cf9c414b0afbec428a536ab8e7dd93bc002b" +dependencies = [ + "arc-swap", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pack 0.68.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "parking_lot", + "tempfile", + "thiserror 2.0.18", +] + [[package]] name = "gix-odb-tests" version = "0.0.0" dependencies = [ "crossbeam-channel", "filetime", - "gix-actor", - "gix-date", - "gix-features", - "gix-hash", - "gix-object", - "gix-odb", - "gix-pack", + "gix-actor 0.40.0", + "gix-date 0.15.1", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-pack 0.68.0", "gix-testtools", "maplit", "pretty_assertions", @@ -2132,17 +2862,17 @@ version = "0.68.0" dependencies = [ "clru", "document-features", - 
"gix-chunk", - "gix-diff", - "gix-error", - "gix-features", - "gix-hash", - "gix-hashtable", - "gix-object", - "gix-path", - "gix-tempfile", + "gix-chunk 0.7.0", + "gix-diff 0.61.0", + "gix-error 0.2.1", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-tempfile 21.0.2", "gix-testtools", - "gix-traverse", + "gix-traverse 0.55.0", "memmap2", "parking_lot", "serde", @@ -2151,18 +2881,37 @@ dependencies = [ "uluru", ] +[[package]] +name = "gix-pack" +version = "0.68.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e3484119cd19859d7d7639413c27e192478fa354d3f4ff5f7e3c041e8040f0f4" +dependencies = [ + "clru", + "gix-chunk 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "memmap2", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-pack-tests" version = "0.0.0" dependencies = [ "bstr", - "gix-features", - "gix-hash", - "gix-object", - "gix-odb", - "gix-pack", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-pack 0.68.0", "gix-testtools", - "gix-traverse", + "gix-traverse 0.55.0", "maplit", "memmap2", ] @@ -2177,52 +2926,91 @@ dependencies = [ "faster-hex", "futures-io", "futures-lite", - "gix-hash", - "gix-odb", - "gix-pack", - "gix-trace", + "gix-hash 0.23.0", + "gix-odb 0.78.0", + "gix-pack 0.68.0", + "gix-trace 0.1.18", "maybe-async", "pin-project-lite", "serde", "thiserror 2.0.18", ] +[[package]] +name = "gix-packetline" 
+version = "0.21.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "be19313dcdb7dff75a3ce2f99be00878458295bcc3b6c7f0005591597573345c" +dependencies = [ + "bstr", + "faster-hex", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-path" version = "0.11.2" dependencies = [ "bstr", "gix-testtools", - "gix-trace", - "gix-validate", + "gix-trace 0.1.18", + "gix-validate 0.11.0", "serial_test", "thiserror 2.0.18", "windows 0.62.2", "winreg 0.56.0", ] +[[package]] +name = "gix-path" +version = "0.11.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "09c31d4373bda7fab9eb01822927b55185a378d6e1bf737e0a54c743ad806658" +dependencies = [ + "bstr", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-pathspec" version = "0.16.1" dependencies = [ "bitflags 2.11.0", "bstr", - "gix-attributes", - "gix-config-value", - "gix-glob", - "gix-path", + "gix-attributes 0.31.0", + "gix-config-value 0.17.1", + "gix-glob 0.24.0", + "gix-path 0.11.2", "gix-testtools", "serial_test", "thiserror 2.0.18", ] +[[package]] +name = "gix-pathspec" +version = "0.16.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f89611f13544ca5ebeb68a502673814ef57200df60c24a61c2ce7b96f612f08b" +dependencies = [ + "bitflags 2.11.0", + "bstr", + "gix-attributes 0.31.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-config-value 0.17.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + [[package]] name = "gix-prompt" version = "0.14.1" dependencies = [ "expectrl", - "gix-command", - 
"gix-config-value", + "gix-command 0.8.0", + "gix-config-value 0.17.1", "gix-testtools", "parking_lot", "rustix", @@ -2241,34 +3029,65 @@ dependencies = [ "futures-io", "futures-lite", "gix-credentials", - "gix-date", - "gix-features", - "gix-hash", - "gix-lock", - "gix-negotiate", - "gix-object", - "gix-packetline", - "gix-ref", - "gix-refspec", - "gix-revwalk", - "gix-shallow", - "gix-trace", - "gix-transport", - "gix-utils", + "gix-date 0.15.1", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-lock 21.0.2", + "gix-negotiate 0.29.0", + "gix-object 0.58.0", + "gix-packetline 0.21.2", + "gix-ref 0.61.0", + "gix-refspec 0.39.0", + "gix-revwalk 0.29.0", + "gix-shallow 0.10.0", + "gix-trace 0.1.18", + "gix-transport 0.55.1", + "gix-utils 0.3.1", "maybe-async", "nonempty", "serde", "thiserror 2.0.18", - "winnow", + "winnow 1.0.0", +] + +[[package]] +name = "gix-protocol" +version = "0.59.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "4f38666350736b5877c79f57ddae02bde07a4ce186d889adc391e831cddcbe76" +dependencies = [ + "bstr", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ref 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-shallow 0.10.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-transport 0.55.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "maybe-async", + "nonempty", + "thiserror 2.0.18", + "winnow 0.7.15", +] + +[[package]] +name = "gix-quote" +version = "0.7.0" +dependencies = [ + "bstr", + "gix-error 0.2.1", + "gix-utils 0.3.1", ] [[package]] name = "gix-quote" version = "0.7.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"68533db71259c8776dd4e770d2b7b98696213ecdc1f5c9e3507119e274e0c578" dependencies = [ "bstr", - "gix-error", - "gix-utils", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", ] [[package]] @@ -2280,40 +3099,61 @@ name = "gix-ref" version = "0.61.0" dependencies = [ "document-features", - "gix-actor", - "gix-date", - "gix-features", - "gix-fs", - "gix-hash", - "gix-lock", - "gix-object", - "gix-path", - "gix-tempfile", + "gix-actor 0.40.0", + "gix-date 0.15.1", + "gix-features 0.46.2", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-lock 21.0.2", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-tempfile 21.0.2", "gix-testtools", - "gix-utils", - "gix-validate", + "gix-utils 0.3.1", + "gix-validate 0.11.0", "memmap2", "serde", "thiserror 2.0.18", - "winnow", + "winnow 1.0.0", +] + +[[package]] +name = "gix-ref" +version = "0.61.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "c2159978abb99b7027c8579d15211e262ef0ef2594d5cecb3334fbcbdfe2997c" +dependencies = [ + "gix-actor 0.40.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-lock 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-tempfile 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", + "memmap2", + "thiserror 2.0.18", + "winnow 0.7.15", ] [[package]] name = "gix-ref-tests" version = "0.0.0" 
dependencies = [ - "gix-actor", - "gix-date", - "gix-discover", - "gix-features", - "gix-fs", - "gix-hash", - "gix-lock", - "gix-object", - "gix-odb", - "gix-ref", + "gix-actor 0.40.0", + "gix-date 0.15.1", + "gix-discover 0.49.0", + "gix-features 0.46.2", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-lock 21.0.2", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-ref 0.61.0", "gix-testtools", - "gix-validate", + "gix-validate 0.11.0", "insta", ] @@ -2322,17 +3162,33 @@ name = "gix-refspec" version = "0.39.0" dependencies = [ "bstr", - "gix-error", - "gix-glob", - "gix-hash", - "gix-revision", + "gix-error 0.2.1", + "gix-glob 0.24.0", + "gix-hash 0.23.0", + "gix-revision 0.43.0", "gix-testtools", - "gix-validate", + "gix-validate 0.11.0", "insta", "smallvec", "thiserror 2.0.18", ] +[[package]] +name = "gix-refspec" +version = "0.39.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "dc806ee13f437428f8a1ba4c72ecfaa3f20e14f5f0d4c2bc17d0b33e794aa6ac" +dependencies = [ + "bstr", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revision 0.43.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-revision" version = "0.43.0" @@ -2340,50 +3196,97 @@ dependencies = [ "bitflags 2.11.0", "bstr", "document-features", - "gix-commitgraph", - "gix-date", - "gix-error", - "gix-hash", - "gix-hashtable", - "gix-object", - "gix-odb", - "gix-revwalk", + "gix-commitgraph 0.35.0", + "gix-date 0.15.1", + "gix-error 0.2.1", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-revwalk 0.29.0", "gix-testtools", - "gix-trace", + "gix-trace 0.1.18", "insta", "nonempty", - 
"permutohedron", - "serde", + "permutohedron", + "serde", +] + +[[package]] +name = "gix-revision" +version = "0.43.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7c08f1ec5d1e6a524f8ba291c41f0ccaef64e48ed0e8cf790b3461cae45f6d3d" +dependencies = [ + "bitflags 2.11.0", + "bstr", + "gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-trace 0.1.18 (registry+https://github.com/rust-lang/crates.io-index)", + "nonempty", ] [[package]] name = "gix-revwalk" version = "0.29.0" dependencies = [ - "gix-commitgraph", - "gix-date", - "gix-error", - "gix-hash", - "gix-hashtable", - "gix-object", + "gix-commitgraph 0.35.0", + "gix-date 0.15.1", + "gix-error 0.2.1", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", "gix-testtools", "smallvec", "thiserror 2.0.18", ] +[[package]] +name = "gix-revwalk" +version = "0.29.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0e4b2b87772b21ca449249e86d32febadba5cba32b0fcce804ab9cefc6f2111c" +dependencies = [ + "gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 
(registry+https://github.com/rust-lang/crates.io-index)", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-sec" version = "0.13.2" dependencies = [ "bitflags 2.11.0", "document-features", - "gix-path", + "gix-path 0.11.2", "libc", "serde", "tempfile", "windows-sys 0.61.2", ] +[[package]] +name = "gix-sec" +version = "0.13.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bf82ae037de9c62850ce67beaa92ec8e3e17785ea307cdde7618edc215603b4f" +dependencies = [ + "bitflags 2.11.0", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "libc", + "windows-sys 0.61.2", +] + [[package]] name = "gix-sequencer" version = "0.0.0" @@ -2393,14 +3296,27 @@ name = "gix-shallow" version = "0.10.0" dependencies = [ "bstr", - "gix-hash", - "gix-lock", + "gix-hash 0.23.0", + "gix-lock 21.0.2", "nonempty", "serde", "tempfile", "thiserror 2.0.18", ] +[[package]] +name = "gix-shallow" +version = "0.10.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "cbf60711c9083b2364b3fac8a352444af76b17201f3682fdebe74fa66d89a772" +dependencies = [ + "bstr", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-lock 21.0.2 (registry+https://github.com/rust-lang/crates.io-index)", + "nonempty", + "thiserror 2.0.18", +] + [[package]] name = "gix-status" version = "0.28.0" @@ -2408,17 +3324,40 @@ dependencies = [ "bstr", "document-features", "filetime", - "gix-diff", - "gix-dir", - "gix-features", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-index", - "gix-object", - "gix-path", - "gix-pathspec", - "gix-worktree", + "gix-diff 0.61.0", + "gix-dir 0.23.0", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", + "gix-worktree 0.50.0", + "portable-atomic", + "thiserror 2.0.18", +] + +[[package]] +name = "gix-status" +version = "0.28.0" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "23d6c598e3fdbc352fba1c5ba7e709e69402fafbc44d9295edad2e3c4738996b" +dependencies = [ + "bstr", + "filetime", + "gix-diff 0.61.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-dir 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pathspec 0.16.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", "portable-atomic", "thiserror 2.0.18", ] @@ -2429,20 +3368,20 @@ version = "0.0.0" dependencies = [ "bstr", "filetime", - "gix-diff", - "gix-dir", - "gix-features", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-index", - "gix-object", - "gix-odb", - "gix-path", - "gix-pathspec", - "gix-status", + "gix-diff 0.61.0", + "gix-dir 0.23.0", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", + "gix-status 0.28.0", "gix-testtools", - "gix-worktree", + "gix-worktree 0.50.0", "pretty_assertions", ] @@ -2451,13 +3390,28 @@ name = "gix-submodule" version = "0.28.0" dependencies = [ "bstr", - "gix-config", - "gix-features", - "gix-path", - "gix-pathspec", - "gix-refspec", + "gix-config 0.54.0", + "gix-features 0.46.2", + "gix-path 0.11.2", + "gix-pathspec 0.16.1", + "gix-refspec 0.39.0", "gix-testtools", - 
"gix-url", + "gix-url 0.35.2", + "thiserror 2.0.18", +] + +[[package]] +name = "gix-submodule" +version = "0.28.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "0ce5c3929c5e6821f651d35e8420f72fea3cfafe9fc1e928a61e718b462c72a5" +dependencies = [ + "bstr", + "gix-config 0.54.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-pathspec 0.16.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-refspec 0.39.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-url 0.35.2 (registry+https://github.com/rust-lang/crates.io-index)", "thiserror 2.0.18", ] @@ -2467,7 +3421,7 @@ version = "21.0.2" dependencies = [ "dashmap", "document-features", - "gix-fs", + "gix-fs 0.19.2", "libc", "parking_lot", "signal-hook 0.4.3", @@ -2475,6 +3429,19 @@ dependencies = [ "tempfile", ] +[[package]] +name = "gix-tempfile" +version = "21.0.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d22227f6b203f511ff451c33c89899e87e4f571fc596b06f68e6e613a6508528" +dependencies = [ + "dashmap", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "libc", + "parking_lot", + "tempfile", +] + [[package]] name = "gix-testtools" version = "0.19.0" @@ -2484,19 +3451,19 @@ dependencies = [ "document-features", "fastrand", "fs_extra", - "gix-discover", - "gix-fs", - "gix-hash", - "gix-lock", - "gix-tempfile", - "gix-worktree", + "gix-discover 0.49.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-lock 21.0.2", + "gix-tempfile 21.0.2", + "gix-worktree 0.50.0", "io-close", "is_ci", "parking_lot", "serial_test", "tar", "tempfile", - "winnow", + "winnow 1.0.0", "xz2", ] @@ -2512,6 +3479,12 @@ dependencies = [ "tracing", ] +[[package]] +name = "gix-trace" +version = "0.1.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"f69a13643b8437d4ca6845e08143e847a36ca82903eed13303475d0ae8b162e0" + [[package]] name = "gix-transport" version = "0.55.1" @@ -2525,15 +3498,15 @@ dependencies = [ "document-features", "futures-io", "futures-lite", - "gix-command", + "gix-command 0.8.0", "gix-credentials", - "gix-features", - "gix-hash", - "gix-pack", - "gix-packetline", - "gix-quote", - "gix-sec", - "gix-url", + "gix-features 0.46.2", + "gix-hash 0.23.0", + "gix-pack 0.68.0", + "gix-packetline 0.21.2", + "gix-quote 0.7.0", + "gix-sec 0.13.2", + "gix-url 0.35.2", "maybe-async", "pin-project-lite", "reqwest", @@ -2541,17 +3514,50 @@ dependencies = [ "thiserror 2.0.18", ] +[[package]] +name = "gix-transport" +version = "0.55.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "a521e39c6235ce63ed6c001e2dd79818c830b82c3b7b59247ee7b229c39ec9bb" +dependencies = [ + "bstr", + "gix-command 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-packetline 0.21.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-quote 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-sec 0.13.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-url 0.35.2 (registry+https://github.com/rust-lang/crates.io-index)", + "thiserror 2.0.18", +] + +[[package]] +name = "gix-traverse" +version = "0.55.0" +dependencies = [ + "bitflags 2.11.0", + "gix-commitgraph 0.35.0", + "gix-date 0.15.1", + "gix-hash 0.23.0", + "gix-hashtable 0.13.0", + "gix-object 0.58.0", + "gix-revwalk 0.29.0", + "smallvec", + "thiserror 2.0.18", +] + [[package]] name = "gix-traverse" version = "0.55.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "963dc2afcdb611092aa587c3f9365e749ac0a0892ff27662dbc75f26c953fbec" dependencies = [ "bitflags 2.11.0", - "gix-commitgraph", - "gix-date", - "gix-hash", - "gix-hashtable", - "gix-object", - "gix-revwalk", + 
"gix-commitgraph 0.35.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-date 0.15.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hashtable 0.13.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-revwalk 0.29.0 (registry+https://github.com/rust-lang/crates.io-index)", "smallvec", "thiserror 2.0.18", ] @@ -2560,13 +3566,13 @@ dependencies = [ name = "gix-traverse-tests" version = "0.0.0" dependencies = [ - "gix-commitgraph", - "gix-hash", - "gix-object", - "gix-odb", - "gix-path", + "gix-commitgraph 0.35.0", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", "gix-testtools", - "gix-traverse", + "gix-traverse 0.55.0", "insta", ] @@ -2581,16 +3587,39 @@ dependencies = [ "assert_matches", "bstr", "document-features", - "gix-path", + "gix-path 0.11.2", "gix-testtools", "percent-encoding", "serde", "thiserror 2.0.18", ] +[[package]] +name = "gix-url" +version = "0.35.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d28e8af3d42581190da884f013caf254d2fd4d6ab102408f08d21bfa11de6c8d" +dependencies = [ + "bstr", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "percent-encoding", + "thiserror 2.0.18", +] + +[[package]] +name = "gix-utils" +version = "0.3.1" +dependencies = [ + "bstr", + "fastrand", + "unicode-normalization", +] + [[package]] name = "gix-utils" version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "befcdbdfb1238d2854591f760a48711bed85e72d80a10e8f2f93f656746ef7c5" dependencies = [ "bstr", "fastrand", @@ -2605,37 +3634,82 @@ dependencies = [ "gix-testtools", ] +[[package]] +name = "gix-validate" +version = "0.11.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"0ec1eff98d91941f47766367cba1be746bab662bad761d9891ae6f7882f7840b" +dependencies = [ + "bstr", +] + [[package]] name = "gix-worktree" version = "0.50.0" dependencies = [ "bstr", "document-features", - "gix-attributes", - "gix-fs", - "gix-glob", - "gix-hash", - "gix-ignore", - "gix-index", - "gix-object", - "gix-path", - "gix-validate", + "gix-attributes 0.31.0", + "gix-fs 0.19.2", + "gix-glob 0.24.0", + "gix-hash 0.23.0", + "gix-ignore 0.19.1", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-validate 0.11.0", "serde", ] +[[package]] +name = "gix-worktree" +version = "0.50.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e6bd5830cbc43c9c00918b826467d2afad685b195cb82329cde2b2d116d2c578" +dependencies = [ + "bstr", + "gix-attributes 0.31.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-glob 0.24.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-ignore 0.19.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-validate 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)", +] + +[[package]] +name = "gix-worktree-state" +version = "0.28.0" +dependencies = [ + "bstr", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-path 0.11.2", + "gix-worktree 0.50.0", + "gix-worktree-state 0.28.0", + "io-close", + "thiserror 2.0.18", +] + [[package]] name = "gix-worktree-state" version = "0.28.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"644a1681f96e1be43c2a8384337d9d220e7624f50db54beda70997052aebf707" dependencies = [ "bstr", - "gix-features", - "gix-filter", - "gix-fs", - "gix-index", - "gix-object", - "gix-path", - "gix-worktree", - "gix-worktree-state", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-index 0.49.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-worktree 0.50.0 (registry+https://github.com/rust-lang/crates.io-index)", "io-close", "thiserror 2.0.18", ] @@ -2644,16 +3718,16 @@ dependencies = [ name = "gix-worktree-state-tests" version = "0.0.0" dependencies = [ - "gix-discover", - "gix-features", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-index", - "gix-object", - "gix-odb", + "gix-discover 0.49.0", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", "gix-testtools", - "gix-worktree-state", + "gix-worktree-state 0.28.0", "symlink", "walkdir", ] @@ -2662,18 +3736,36 @@ dependencies = [ name = "gix-worktree-stream" version = "0.30.0" dependencies = [ - "gix-attributes", - "gix-error", - "gix-features", - "gix-filter", - "gix-fs", - "gix-hash", - "gix-object", - "gix-odb", - "gix-path", + "gix-attributes 0.31.0", + "gix-error 0.2.1", + "gix-features 0.46.2", + "gix-filter 0.28.0", + "gix-fs 0.19.2", + "gix-hash 0.23.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", "gix-testtools", - "gix-traverse", - "gix-worktree", + "gix-traverse 0.55.0", + "gix-worktree 0.50.0", + "parking_lot", +] + +[[package]] +name = "gix-worktree-stream" +version = "0.30.0" +source = 
"registry+https://github.com/rust-lang/crates.io-index" +checksum = "24e3fb70a1f650a5cec7d5b8d10d6d6fe86daf3cf15bde08ba0c70988a2932c3" +dependencies = [ + "gix-attributes 0.31.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-error 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-features 0.46.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-filter 0.28.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-fs 0.19.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-hash 0.23.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-object 0.58.0 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-path 0.11.2 (registry+https://github.com/rust-lang/crates.io-index)", + "gix-traverse 0.55.0 (registry+https://github.com/rust-lang/crates.io-index)", "parking_lot", ] @@ -2682,19 +3774,19 @@ name = "gix-worktree-tests" version = "0.0.0" dependencies = [ "bstr", - "gix-attributes", - "gix-discover", - "gix-features", - "gix-fs", - "gix-glob", - "gix-hash", - "gix-ignore", - "gix-index", - "gix-object", - "gix-odb", - "gix-path", + "gix-attributes 0.31.0", + "gix-discover 0.49.0", + "gix-features 0.46.2", + "gix-fs 0.19.2", + "gix-glob 0.24.0", + "gix-hash 0.23.0", + "gix-ignore 0.19.1", + "gix-index 0.49.0", + "gix-object 0.58.0", + "gix-odb 0.78.0", + "gix-path 0.11.2", "gix-testtools", - "gix-worktree", + "gix-worktree 0.50.0", "symlink", ] @@ -2704,6 +3796,19 @@ version = "0.3.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0cc23270f6e1808e30a928bdc84dea0b9b4136a8bc82338574f23baf47bbd280" +[[package]] +name = "globset" +version = "0.4.18" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "52dfc19153a48bde0cbd630453615c8151bce3a5adfac7a0aebfbf0a1e1f57e3" +dependencies = [ + "aho-corasick", + "bstr", + "log", + "regex-automata", + "regex-syntax", +] + [[package]] name = "gloo-timers" version = "0.3.0" @@ 
-2812,6 +3917,12 @@ version = "0.5.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fc0fef456e4baa96da950455cd02c081ca953b141298e41db3fc7e36b1da849c" +[[package]] +name = "hex" +version = "0.4.3" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70" + [[package]] name = "hickory-proto" version = "0.25.2" @@ -2828,7 +3939,7 @@ dependencies = [ "idna", "ipnet", "once_cell", - "rand", + "rand 0.9.4", "ring", "thiserror 2.0.18", "tinyvec", @@ -2850,7 +3961,7 @@ dependencies = [ "moka", "once_cell", "parking_lot", - "rand", + "rand 0.9.4", "resolv-conf", "smallvec", "thiserror 2.0.18", @@ -2980,6 +4091,30 @@ dependencies = [ "tracing", ] +[[package]] +name = "iana-time-zone" +version = "0.1.65" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e31bc9ad994ba00e440a8aa5c9ef0ec67d5cb5e5cb0cc7f8b744a35b389cc470" +dependencies = [ + "android_system_properties", + "core-foundation-sys", + "iana-time-zone-haiku", + "js-sys", + "log", + "wasm-bindgen", + "windows-core", +] + +[[package]] +name = "iana-time-zone-haiku" +version = "0.1.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f31827a206f56af32e590ba56d5d2d085f558508192593743f16b2306495269f" +dependencies = [ + "cc", +] + [[package]] name = "icu_collections" version = "2.1.1" @@ -3094,6 +4229,25 @@ dependencies = [ "icu_properties", ] +[[package]] +name = "imara-diff" +version = "0.1.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "17d34b7d42178945f775e84bc4c36dde7c1c6cdfea656d3354d009056f2bb3d2" +dependencies = [ + "hashbrown 0.15.5", +] + +[[package]] +name = "imara-diff" +version = "0.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2f01d462f766df78ab820dd06f5eb700233c51f0f4c2e846520eaf4ba6aa5c5c" +dependencies = [ + "hashbrown 0.15.5", + "memchr", +] + [[package]] name = 
"indexmap" version = "2.13.0" @@ -3146,7 +4300,7 @@ version = "0.0.0" dependencies = [ "anyhow", "clap", - "gix", + "gix 0.81.0", "regex", ] @@ -3833,6 +4987,16 @@ version = "0.2.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b687ff7b5da449d39e418ad391e5e08da53ec334903ddbb921db208908fc372c" +[[package]] +name = "petgraph" +version = "0.7.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "3672b37090dbd86368a4145bc067582552b29c27377cad4e0a306c97f9bd7772" +dependencies = [ + "fixedbitset", + "indexmap", +] + [[package]] name = "pin-project-lite" version = "0.2.17" @@ -3856,6 +5020,16 @@ dependencies = [ "futures-io", ] +[[package]] +name = "pkcs8" +version = "0.10.2" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "f950b2377845cebe5cf8b5165cb3cc1a5e0fa5cfa3e1f7f55707d8fd82e0a7b7" +dependencies = [ + "der", + "spki", +] + [[package]] name = "pkg-config" version = "0.3.32" @@ -4040,7 +5214,7 @@ dependencies = [ "bytes", "getrandom 0.3.4", "lru-slab", - "rand", + "rand 0.9.4", "ring", "rustc-hash", "rustls", @@ -4087,14 +5261,56 @@ version = "6.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f8dcc9c7d52a811697d2151c701e0d08956f92b0e24136cf4cf27b57a6a0d9bf" +[[package]] +name = "rakia-brit" +version = "0.1.0" +dependencies = [ + "gix 0.81.0 (registry+https://github.com/rust-lang/crates.io-index)", + "serde", + "serde_json", + "thiserror 2.0.18", +] + +[[package]] +name = "rakia-core" +version = "0.1.0" +dependencies = [ + "chrono", + "globset", + "serde", + "serde_json", + "thiserror 2.0.18", +] + +[[package]] +name = "rand" +version = "0.8.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "5ca0ecfa931c29007047d1bc58e623ab12e5590e8c7cc53200d5202b69266d8a" +dependencies = [ + "libc", + "rand_chacha 0.3.1", + "rand_core 0.6.4", +] + [[package]] name = "rand" version = "0.9.4" source = 
"registry+https://github.com/rust-lang/crates.io-index" checksum = "44c5af06bb1b7d3216d91932aed5265164bf384dc89cd6ba05cf59a35f5f76ea" dependencies = [ - "rand_chacha", - "rand_core", + "rand_chacha 0.9.0", + "rand_core 0.9.5", +] + +[[package]] +name = "rand_chacha" +version = "0.3.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88" +dependencies = [ + "ppv-lite86", + "rand_core 0.6.4", ] [[package]] @@ -4104,7 +5320,16 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d3022b5f1df60f26e1ffddd6c66e8aa15de382ae63b3a0c1bfc0e4d3e3f325cb" dependencies = [ "ppv-lite86", - "rand_core", + "rand_core 0.9.5", +] + +[[package]] +name = "rand_core" +version = "0.6.4" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c" +dependencies = [ + "getrandom 0.2.17", ] [[package]] @@ -4644,7 +5869,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e3bf829a2d51ab4a5ddf1352d8470c140cadc8301b2ae1789db023f01cedd6ba" dependencies = [ "cfg-if", - "cpufeatures", + "cpufeatures 0.2.17", "digest", ] @@ -4665,7 +5890,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a7507d819769d01a365ab707794a4084392c824f54a7a6a7862f8c3d0892b283" dependencies = [ "cfg-if", - "cpufeatures", + "cpufeatures 0.2.17", "digest", ] @@ -4731,6 +5956,15 @@ dependencies = [ "libc", ] +[[package]] +name = "signature" +version = "2.2.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "77549399552de45a898a580c1b41d445bf730df867cc44e6c0233bbc4b8329de" +dependencies = [ + "rand_core 0.6.4", +] + [[package]] name = "simd-adler32" version = "0.3.8" @@ -4778,6 +6012,16 @@ dependencies = [ "windows-sys 0.61.2", ] +[[package]] +name = "spki" +version = "0.7.3" +source = "registry+https://github.com/rust-lang/crates.io-index" 
+checksum = "d91ed6c858b01f942cd56b37a94b3e0a1798290327d1236e4d9cf4eaca44d29d" +dependencies = [ + "base64ct", + "der", +] + [[package]] name = "sqlite-wasm-rs" version = "0.5.2" @@ -5134,7 +6378,7 @@ dependencies = [ "toml_datetime", "toml_parser", "toml_writer", - "winnow", + "winnow 1.0.0", ] [[package]] @@ -5152,7 +6396,7 @@ version = "1.1.1+spec-1.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "39ca317ebc49f06bd748bfba29533eac9485569dc9bf80b849024b025e814fb9" dependencies = [ - "winnow", + "winnow 1.0.0", ] [[package]] @@ -6079,6 +7323,15 @@ version = "0.53.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d6bbff5f0aada427a1e5a6da5f1f98158182f26556f345ac9e04d36d0ebed650" +[[package]] +name = "winnow" +version = "0.7.15" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "df79d97927682d2fd8adb29682d1140b343be4ac0f08fd68b7765d9c059d3945" +dependencies = [ + "memchr", +] + [[package]] name = "winnow" version = "1.0.0" diff --git a/Cargo.toml b/Cargo.toml index 6f16788ed34..0447f9139c8 100644 --- a/Cargo.toml +++ b/Cargo.toml @@ -9,7 +9,7 @@ edition = "2021" license = "MIT OR Apache-2.0" version = "0.52.0" rust-version = "1.82" -default-run = "gix" +default-run = "brit" include = ["/src/**/*", "/build.rs", "/LICENSE-*", "/README.md"] resolver = "2" @@ -21,7 +21,7 @@ test = false doctest = false [[bin]] -name = "gix" +name = "brit" path = "src/gix.rs" doc = false test = false @@ -302,7 +302,14 @@ members = [ "gix-ref/tests", "gix-config/tests", "gix-traverse/tests", - "gix-shallow" + "gix-shallow", + "brit-epr", + "brit-verify", + "brit-build-ref", + "brit-graph", + "brit-cli", + "tests/cli-journey", + "tests/cli-test-page", ] [workspace.dependencies] diff --git a/README.md b/README.md index d5186327197..4f183f9980a 100644 --- a/README.md +++ b/README.md @@ -1,460 +1,162 @@ 
-[![CI](https://github.com/GitoxideLabs/gitoxide/workflows/ci/badge.svg)](https://github.com/GitoxideLabs/gitoxide/actions) -[![Crates.io](https://img.shields.io/crates/v/gitoxide.svg)](https://crates.io/crates/gitoxide) - +# brit -`gitoxide` is an implementation of `git` written in Rust for developing future-proof applications which strive for correctness and -performance while providing a pleasant and unsurprising developer experience. +[![Contribute](https://www.eclipse.org/che/contribute.svg)](https://code.ethosengine.com/#https://github.com/ethosengine/brit) -There are two primary ways to use `gitoxide`: +**Brit** (בְּרִית, "covenant") is an expansion of [gitoxide](https://github.com/GitoxideLabs/gitoxide) — a pure-Rust implementation of git — that integrates protocol-level primitives for tracking who built code, what value it creates, and who governs it. Every commit in a brit repo is a covenant: a witnessed agreement whose terms travel with the code, no matter where it goes. -1. **As Rust library**: Use the [`gix`](https://docs.rs/gix) crate as a Cargo dependency for API access. -1. **As command-line tool**: The `gix` binary as development tool to help testing the API in real repositories, - and the `ein` binary with workflow-enhancing tools. Both binaries may forever be unstable, - *do not rely on them in scripts*. - -[![asciicast](etc/gix-asciicast.svg)](https://asciinema.org/a/542159) - -[`gix`]: https://docs.rs/gix - -## Development Status - -The command-line tools as well as the status of each crate is described in -[the crate status document](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md). - -For use in applications, look for the [`gix`](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix) crate, -which serves as entrypoint to the functionality provided by various lower-level plumbing crates like -[`gix-config`](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-config). 
- -### Feature Discovery - -> Can `gix` do what I need it to do? - -The above can be hard to answer and this paragraph is here to help with feature discovery. - -Look at [`crate-status.md`](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md) for a rather exhaustive document that contains -both implemented and planned features. - -Further, the [`gix` crate documentation with the `git2` search term](https://docs.rs/gix/latest/gix?search=git2) helps to find all currently -known `git2` equivalent method calls. Please note that this list is definitely not exhaustive yet, but might help if you are coming from `git2`. - -What follows is a high-level list of features and those which are planned: - -* [x] clone -* [x] fetch -* [ ] push -* [x] blame (*plumbing*) -* [x] status -* [x] blob and tree-diff -* [ ] merge - - [x] blobs - - [x] trees - - [ ] commits -* [x] commit - - [ ] hooks -* [x] commit-graph traversal -* [ ] rebase -* [x] worktree checkout and worktree stream -* [ ] reset -* [x] reading and writing of objects -* [x] reading and writing of refs -* [x] reading and writing of `.git/index` -* [x] reading and writing of git configuration -* [x] pathspecs -* [x] revspecs -* [x] `.gitignore` and `.gitattributes` - -### Crates - -Follow linked crate name for detailed status. Please note that all crates follow [semver] as well as the [stability guide]. - -[semver]: https://semver.org - -### Production Grade - -* **Stability Tier 1** - - [gix-lock](https://github.com/GitoxideLabs/gitoxide/blob/main/gix-lock/README.md) - -* **Stability Tier 2** - - [gix-tempfile](https://github.com/GitoxideLabs/gitoxide/blob/main/gix-tempfile/README.md) - -### Stabilization Candidates - -Crates that seem feature complete and need to see some more use before they can be released as 1.0. -Documentation is complete and was reviewed at least once. 
- -* [gix-mailmap](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-mailmap) -* [gix-chunk](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-chunk) -* [gix-ref](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-ref) -* [gix-config](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-config) -* [gix-config-value](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-config-value) -* [gix-glob](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-glob) -* [gix-actor](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-actor) -* [gix-hash](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-hash) - -### Initial Development - -These crates may be missing some features and thus are somewhat incomplete, but what's there -is usable to some extent. - -* **usable** _(with rough but complete docs, possibly incomplete functionality)_ - * [gix](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix) (**⬅ entrypoint**) - * [gix-object](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-object) - * [gix-validate](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-validate) - * [gix-url](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-url) - * [gix-packetline](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-packetline) - * [gix-packetline-blocking](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-packetline) - * [gix-transport](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-transport) - * [gix-protocol](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-protocol) - * [gix-pack](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-pack) - * [gix-odb](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-odb) - * 
[gix-commitgraph](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-commitgraph) - * [gix-diff](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-diff) - * [gix-traverse](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-traverse) - * [gix-features](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-features) - * [gix-credentials](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-credentials) - * [gix-sec](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-sec) - * [gix-quote](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-quote) - * [gix-discover](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-discover) - * [gix-path](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-path) - * [gix-attributes](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-attributes) - * [gix-ignore](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-ignore) - * [gix-pathspec](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-pathspec) - * [gix-index](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-index) - * [gix-revision](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-revision) - * [gix-revwalk](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-revwalk) - * [gix-command](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-command) - * [gix-prompt](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-prompt) - * [gix-refspec](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-refspec) - * [gix-fs](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-fs) - * [gix-utils](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-utils) - * 
[gix-hashtable](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-hashtable) - * [gix-worktree](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-worktree) - * [gix-bitmap](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-bitmap) - * [gix-negotiate](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-negotiate) - * [gix-filter](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-filter) - * [gix-worktree-stream](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-worktree-stream) - * [gix-archive](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-archive) - * [gix-submodule](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-submodule) - * [gix-status](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-status) - * [gix-worktree-state](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-worktree-state) - * [gix-date](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-date) - * [gix-dir](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-dir) - * [gix-merge](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-merge) - * [gix-shallow](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-shallow) - * [gix-error](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-error) - * `gitoxide-core` -* **very early** _(possibly without any documentation and many rough edges)_ - * [gix-blame](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-blame) -* **idea** _(just a name placeholder)_ - * [gix-note](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-note) - * [gix-fetchhead](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-fetchhead) - * [gix-lfs](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-lfs) - 
* [gix-rebase](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-rebase) - * [gix-sequencer](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-sequencer) - * [gix-tui](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-tui) - * [gix-tix](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-tix) - * [gix-bundle](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-bundle) - * [gix-fsck](https://github.com/GitoxideLabs/gitoxide/blob/main/crate-status.md#gix-fsck) - -### Stress Testing - * [x] Verify huge packs - * [x] Explode a pack to disk - * [x] Generate and verify large commit graphs - * [ ] Generate huge pack from a lot of loose objects - -### Stability and MSRV - -Our [stability guide] helps to judge how much churn can be expected when depending on crates in this workspace. - -[stability guide]: https://github.com/GitoxideLabs/gitoxide/blob/main/STABILITY.md - -## Installation - -### Download a Binary Release - -Using `cargo binstall`, one is able to fetch [binary releases][releases]. You can install it via `cargo install cargo-binstall`, assuming -the [rust toolchain][rustup] is present. - -Then install gitoxide with `cargo binstall gitoxide`. - -See the [releases section][releases] for manual installation and various alternative builds that are _slimmer_ or _smaller_, depending -on your needs, for _Linux_, _MacOS_ and _Windows_. - -[releases]: https://github.com/GitoxideLabs/gitoxide/releases - -### Download from Arch Linux repository - -For Arch Linux you can download `gitoxide` from `community` repository: - -```sh -pacman -S gitoxide -``` +The name rhymes with *git* on purpose. Git is the substrate. Brit is the covenant laid on top. -### Download from Exherbo Linux Rust repository +A brit repo is a valid git repo. You can `git clone` it from GitHub. You can push it to GitLab, Codeberg, sourcehut. Everything works. 
But inside the [Elohim Protocol](https://github.com/ethosengine/elohim) network, the same repo resolves to a richer view: provenance, economic events, governance context, and content-addressed links that know where your code is running. -For Exherbo Linux you can download `gitoxide` from the [Rust](https://gitlab.exherbo.org/exherbo/rust/-/tree/master/packages/dev-scm/gitoxide) repository: +## Why this exists -```sh -cave resolve -x repository/rust -cave resolve -x gitoxide -``` +### The problem: power is siloed -### From Source via Cargo +The world has three forms of power, and today they're separated: -`cargo` is the Rust package manager which can easily be obtained through [rustup]. With it, you can build your own binary -effortlessly and for your particular CPU for additional performance gains. +- **Economic power** — money, wealth, capital. Concentrated in institutions that extract value from the systems they control. +- **Informational power** — knowledge, data, distribution. Concentrated in platforms that control what you see and who sees you. +- **Social and network power** — trust, governance, collective decision-making. Concentrated in corporations and governments that make rules for everyone while being accountable to almost no one. -The minimum supported Rust version is [documented in the Cargo package](https://github.com/GitoxideLabs/gitoxide/blob/main/gix/Cargo.toml#L12-L14), -the latest stable one will work as well. +These silos aren't accidental. They're profitable. When economic power is decoupled from the knowledge it was built on, you get proprietary lock-in. When informational power is decoupled from governance, you get surveillance capitalism. When social power is decoupled from economic accountability, you get institutions that privatize gains and socialize costs. -There are various build configurations, all of them are [documented here](https://docs.rs/crate/gitoxide/latest). 
The documentation should also be useful -for packagers who need to tune external dependencies. +Every open-source project lives at the intersection of all three — code is knowledge (informational), contributors create value (economic), and maintainers make decisions for everyone who depends on them (governance) — but git, the tool that tracks it all, knows about exactly *none* of it. Git tracks content. It doesn't track value. It doesn't track governance. It doesn't even reliably track who contributed what, beyond a name and email in a commit header. -```sh -# A way to install `gitoxide` with just Rust and a C compiler installed. -# If there are problems with SSL certificates during clones, try to omit `--locked`. -cargo install gitoxide --locked --no-default-features --features max-pure +### The solution: couple them at the protocol level -# The default installation, 'max', is the fastest, but also needs `cmake` to build successfully. -# Installing it is platform-dependent. -cargo install gitoxide +The [Elohim Protocol](https://github.com/ethosengine/elohim) introduces three coupled primitives — **lamad** (knowledge), **shefa** (value), and **qahal** (governance) — and requires that every notarized artifact in the network carries all three. You cannot create a content-addressed artifact that declares what it is without also declaring who stewards it and what governance applies. The architecture makes it structurally difficult to circulate knowledge without recognizing its stewards, and structurally easy to honor their care. -# For smaller binaries and even faster build times that are traded for a less fancy CLI implementation, -# use the `lean` feature. -cargo install gitoxide --locked --no-default-features --features lean -``` +Brit brings this coupling to version control. 
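The coupling described above can be sketched with Rust's type system. This is a toy illustration, not brit's real API: the field names follow the protocol's vocabulary (`lamad`, `shefa`, `qahal`), but the struct, the `address` method, and the use of a std hasher in place of real IPLD CIDs are all assumptions for illustration. The point it demonstrates is structural: the content address is derived from all three pillars at once, so an artifact cannot exist without declaring knowledge, stewardship, and governance together, and a fork with identical code but different stewardship gets a different address.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy model of a three-pillar artifact. All fields are required:
// there is no way to construct an Artifact that omits stewardship
// or governance. (Illustrative only; brit uses IPLD CIDs, not this hash.)
#[derive(Hash, Clone)]
struct Artifact {
    lamad: String, // knowledge: what the artifact is
    shefa: String, // value: who stewards it
    qahal: String, // governance: which rules apply
}

impl Artifact {
    // The address covers every pillar, so changing any one of them
    // yields a different identity.
    fn address(&self) -> u64 {
        let mut h = DefaultHasher::new();
        self.hash(&mut h);
        h.finish()
    }
}

fn main() {
    let original = Artifact {
        lamad: "auth-service v2.1.0".into(),
        shefa: "steward: coop-collective".into(),
        qahal: "governance: community-consent".into(),
    };
    // Same code, different steward: the address changes, because
    // provenance is part of the identity, not metadata beside it.
    let fork = Artifact {
        shefa: "steward: big-corp".into(),
        ..original.clone()
    };
    assert_ne!(original.address(), fork.address());
    println!("original={:#x} fork={:#x}", original.address(), fork.address());
}
```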
-The following installs the latest unpublished `max` release directly from git: +## What this means for code -```sh -cargo install --git https://github.com/GitoxideLabs/gitoxide gitoxide -``` +### 1. A way to pay the open source contributor, built in -#### How to deal with build failures +Today, open source runs on unpaid labor. Contributions are tracked by git (author, committer), but the economic relationship between contribution and value is invisible to the tooling. Payment is an afterthought — a GitHub Sponsors button, a Patreon link, a corporate donation. None of it is wired into the act of building. -On some platforms, installation may fail due to lack of tools required by *C* toolchains. This can generally be avoided by installation with: +In a brit repo, every commit carries a **shefa** trailer that declares the economic event: who contributed, what kind of work it was, what stewardship changed. When someone builds your package, the protocol's economic layer records a recognition event — not a financial transaction, but a protocol-level acknowledgment that serving knowledge generates value for those who care for it. Recognition flows proportionally to stewards based on their allocation. -```sh -cargo install gitoxide --no-default-features --features max-pure -``` +This isn't "add a token to npm." This is the substrate knowing, at the commit level, that contribution has value and tracking it the same way git tracks authorship: as a first-class primitive that travels with the code. -What follows is a list of known failures. +### 2. Provenance-aware code — choose who you trust, not just what you run -- On Fedora, `perl` needs to be installed for `OpenSSL` to build properly. This can be done with the following command (see [issue #592](https://github.com/GitoxideLabs/gitoxide/issues/592)): +Here's a thought experiment. Imagine there's a critical piece of infrastructure — call it a cloud platform — built by a large corporation. The code is open source. 
You can read every line. But the corporation starts doing things you disagree with: surveillance, labor violations, environmental harm. You want to keep using the code, but you don't want your usage to legitimize their stewardship. - ```sh - dnf install perl - ``` +Today, you fork the repo on GitHub and hope people notice. The fork has no formal relationship to the original. No one can tell, from the code alone, whether your fork is a legitimate community effort or a fly-by-night copy. -### Using Docker +With brit, a fork is a **first-class covenant** — a new `ForkContentNode` with its own stewardship, its own attestations, its own peers. The code is the same; the stewardship graph is different. When you choose to depend on Coop AWS instead of Amazon AWS, that choice is visible on the protocol's content graph. Your dependency isn't just a semver string in a lockfile — it's an EPR reference that points at specific stewards, specific attestations, specific governance. Everyone on the graph can see which collective you're trusting, and every steward can independently attest that the tags and branches they serve have the integrity needed for deployment. -Some CI/CD pipelines leverage repository cloning. Below is a copy-paste-able example to build docker images for such workflows. -As no official image exists (at this time), an image must first be built. +Provenance isn't metadata bolted on after the fact. It's the address. -> [!NOTE] -> The dockerfile isn't continuously tested as it costs too much time and thus might already be broken. -> PRs are welcome. +### 3. Deployment-aware code — links that know where they're running -#### Building the most compatible base image +Have you ever thought it would be nice if a config reference could resolve differently depending on which environment you're in? Or if a link in your documentation could point at staging when you're on the `dev` branch and production when you're on `main`? 
-```sh -docker build -f etc/docker/Dockerfile.alpine -t gitoxide:latest --compress . --target=pipeline -``` +With an Elohim Protocol Reference (EPR) link, now it can. An EPR is a content address that carries context: `epr:my-service[@v2.1.0][/head][?via=doorway.example.org]`. The same link, in a brit repo, resolves differently based on: -#### Basic usage in a Pipeline +- **Which branch you're on** — each branch has a reach level (`private`, `self`, `trusted`, `familiar`, `community`, `public`, `commons`) that determines who sees it and what it resolves to. +- **Which doorway you're connected to** — a doorway is a gateway node that bridges web2 (GitHub, GitLab) and the protocol network. Your doorway knows your environment. +- **Who's asking** — the protocol's context-aware resolution adapts to the requester's position in the knowledge graph. -For example, if a `Dockerfile` currently uses something like `RUN git clone https://github.com/GitoxideLabs/gitoxide`, first build the image: +Code is no longer limited to a SHA graph address. It's a living artifact in a network that knows what it is, who built it, and where it's running. -```sh -docker build -f etc/docker/Dockerfile.alpine -t gitoxide:latest --compress . -``` +### 4. A fully distributed landing — not just another crypto project + +Under the hood, brit uses [IPFS/IPLD](https://ipld.io/) primitives through [rust-ipfs](https://github.com/ethosengine/rust-ipfs) to take the actual blobs of a codebase and place them on a distributed content-addressed graph. Every tree, every blob, every commit object gets a CID (content identifier) that any peer can resolve. The codebase isn't hosted on a server you hope stays up — it's distributed across a network of peers who can independently verify every byte. + +Other P2P and crypto projects do this too. IPFS, Radicle, and various blockchain-based package registries all make code content-addressed and peer-distributed. 
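The EPR syntax shown above (`epr:my-service[@v2.1.0][/head][?via=doorway.example.org]`) can be split into its four parts with a short parser. A minimal sketch, assuming the bracketed segments are each optional and appear in that order; the `EprRef` struct and field names are illustrative, not brit's actual types, and real resolution (reach levels, doorway context, requester identity) happens on the protocol side after parsing.

```rust
// Hypothetical parser for the EPR reference syntax:
//   epr:<name>[@<version>][/<path>][?via=<doorway>]
// Struct and field names are assumptions for illustration.
#[derive(Debug, PartialEq)]
struct EprRef {
    name: String,
    version: Option<String>,
    path: Option<String>,
    doorway: Option<String>,
}

fn parse_epr(input: &str) -> Option<EprRef> {
    let rest = input.strip_prefix("epr:")?;
    // Peel off the optional segments right-to-left:
    // first `?via=<doorway>`, then `/<path>`, then `@<version>`.
    let (rest, doorway) = match rest.split_once('?') {
        Some((r, q)) => (r, q.strip_prefix("via=").map(|v| v.to_string())),
        None => (rest, None),
    };
    let (rest, path) = match rest.split_once('/') {
        Some((r, p)) => (r, Some(p.to_string())),
        None => (rest, None),
    };
    let (name, version) = match rest.split_once('@') {
        Some((n, v)) => (n.to_string(), Some(v.to_string())),
        None => (rest.to_string(), None),
    };
    if name.is_empty() {
        return None;
    }
    Some(EprRef { name, version, path, doorway })
}

fn main() {
    let r = parse_epr("epr:my-service@v2.1.0/head?via=doorway.example.org").unwrap();
    assert_eq!(r.name, "my-service");
    assert_eq!(r.version.as_deref(), Some("v2.1.0"));
    assert_eq!(r.path.as_deref(), Some("head"));
    assert_eq!(r.doorway.as_deref(), Some("doorway.example.org"));
    // The bare form resolves too; every segment past the name is optional.
    assert_eq!(
        parse_epr("epr:my-service").unwrap(),
        EprRef { name: "my-service".into(), version: None, path: None, doorway: None }
    );
    println!("{r:?}");
}
```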
-Then copy the binaries into your image and replace the `git` directive with a `gix` equivalent. +What makes brit different is *where the code lands*. -```dockerfile -COPY --from gitoxide:latest /bin/gix /usr/local/bin/ -COPY --from gitoxide:latest /bin/ein /usr/local/bin/ +Most distributed code projects land in a network optimized for financial incentives — mine tokens, stake coins, speculate on protocol value. The network exists to create economic returns for participants. Code is the payload; speculation is the purpose. + +Brit lands in the Elohim Protocol network — a network designed to scale **wisdom and care**: the human capacity to steward shared resources responsibly. The three pillars (knowledge, value, governance) are coupled at the substrate level specifically so that code can't circulate without acknowledging who cares for it, and stewardship can't accumulate without the community's consent. The network exists to serve the humans who depend on the code, not to create returns for token holders. + +This is not a philosophical distinction. It's an architectural one. The same content-addressing that makes code distributed also makes stewardship trackable, governance enforceable, and value flows transparent — but only if the network those primitives land in is *designed for care rather than extraction*. A content-addressed blob on a speculation-optimized network is still a blob someone will try to rent-seek from. A content-addressed blob on a care-optimized network is a shared resource the community can actually govern. + +## How it works + +### Commit trailers — the protocol surface + +Every brit commit carries three trailer lines in its message, using the same RFC-822 format as `Signed-off-by:`: -RUN /usr/local/bin/gix clone --depth 1 https://github.com/GitoxideLabs/gitoxide gitoxide ``` +feat: add two-factor auth to login flow +Implements TOTP-based 2FA with QR code provisioning and backup codes. 
-[releases]: https://github.com/GitoxideLabs/gitoxide/releases -[rustup]: https://rustup.rs - -## Usage - -Once installed, there are two binaries: - -* **ein** - * high level commands, _porcelain_, for every-day use, optimized for a pleasant user experience -* **gix** - * low level commands, _plumbing_, for use in more specialized cases and to validate newly written code in real-world scenarios - -## Project Goals - -Project goals can change over time as we learn more, and they can be challenged. - - * **a pure-rust implementation of git** - * including *transport*, *object database*, *references*, *cli* and *tui* - * a simple command-line interface is provided for the most common git operations, optimized for - user experience. A *simple-git* if you so will. - * be the go-to implementation for anyone who wants to solve problems around git, and become - *the* alternative to `GitPython` and *libgit2* in the process. - * become the foundation for a distributed alternative to GitHub, and maybe even for use within GitHub itself - * **learn from the best to write the best possible idiomatic Rust** - * *libgit2* is a fantastic resource to see what abstractions work, we will use them - * use Rust's type system to make misuse impossible - * **be the best performing implementation** - * use Rust's type system to optimize for work not done without being hard to use - * make use of parallelism from the get go - * _sparse checkout_ support from day one - * **assure on-disk consistency** - * assure reads never interfere with concurrent writes - * assure multiple concurrent writes don't cause trouble - * **take shortcuts, but not in quality** - * binaries may use `anyhow::Error` exhaustively, knowing these errors are solely user-facing. - * libraries use light-weight custom errors implemented using `quick-error` or `thiserror`. - * internationalization is nothing we are concerned with right now. 
- * IO errors due to insufficient amount of open file handles don't always lead to operation failure - * **Cross platform support, including Windows** - * With the tools and experience available here there is no reason not to support Windows. - * [Windows is tested on CI](https://github.com/GitoxideLabs/gitoxide/blob/df66d74aa2a8cb62d8a03383135f08c8e8c579a8/.github/workflows/rust.yml#L34) - and failures do prevent releases. - -## Non-Goals - -Project non-goals can change over time as we learn more, and they can be challenged. - - * **replicate `git` command functionality perfectly** - * `git` is `git`, and there is no reason to not use it. Our path is the one of simplicity to make - getting started with git easy. - * **be incompatible to git** - * the on-disk format must remain compatible, and we will never contend with it. - * **use async IO everywhere** - * for the most part, git operations are heavily reliant on memory mapped IO as well as CPU to decompress data, - which doesn't lend itself well to async IO out of the box. - * Use `blocking` as well as `gix-features::interrupt` to bring operations into the async world and to control - long running operations. - * When connecting or streaming over TCP connections, especially when receiving on the server, async seems like a must - though, but behind a feature flag. - -## Contributions - -If what you have seen so far sparked your interest to contribute, then let us say: We are happy to have you and help you to get started. - -We recommend running `just test` during the development process to assure CI is green before pushing. - -A backlog for work ready to be picked up is [available in the Project's Kanban board][project-board], which contains instructions on how -to pick a task. If it's empty or you have other questions, feel free to [start a discussion][discussions] or reach out to @Byron [privately][keybase]. - -For additional details, also take a look at the [collaboration guide]. 
- -[collaboration guide]: https://github.com/GitoxideLabs/gitoxide/blob/main/COLLABORATING.md -[project-board]: https://github.com/GitoxideLabs/gitoxide/projects -[discussions]: https://github.com/GitoxideLabs/gitoxide/discussions -[keybase]: https://keybase.io/byronbates -[cargo-diet]: https://crates.io/crates/cargo-diet - -### Getting started with Video Tutorials - -- [Learning Rust with Gitoxide](https://youtube.com/playlist?list=PLMHbQxe1e9Mk5kOHrm9v20-umkE2ck_gE) - - In 17 episodes you can learn all you need to meaningfully contribute to `gitoxide`. -- [Getting into Gitoxide](https://youtube.com/playlist?list=PLMHbQxe1e9MkEmuj9csczEK1O06l0Npy5) - - Get an introduction to `gitoxide` itself which should be a good foundation for any contribution, but isn't a requirement for contributions either. -- [Gifting Gitoxide](https://www.youtube.com/playlist?list=PLMHbQxe1e9MlhyyZQXPi_dc-bKudE-WUw) - - See how PRs are reviewed along with a lot of inner monologue. - -#### Other Media - -- [Rustacean Station Podcast](https://rustacean-station.org/episode/055-sebastian-thiel/) - -## Roadmap - -### Features for 1.0 - -Provide a CLI to for the most basic user journey: - -* [x] initialize a repository -* [x] fetch - * [ ] and update worktree -* clone a repository - - [ ] bare - - [ ] with working tree -* [ ] create a commit after adding worktree files -* [x] add a remote -* [ ] push - * [x] create (thin) pack - -### Ideas for Examples - -* [ ] `gix tool open-remote` open the URL of the remote, possibly after applying known transformations to go from `ssh` to `https`. -* [ ] `tix` as example implementation of `tig`, displaying a version of the commit graph, useful for practicing how highly responsive GUIs can be made. -* [ ] Something like [`git-sizer`](https://github.com/github/git-sizer), but leveraging extreme decompression speeds of indexed packs. -* [ ] Open up SQL for git using [sqlite virtual tables](https://github.com/rusqlite/rusqlite/blob/master/tests/vtab.rs). 
Check out gitqlite - as well. What would an MVP look like? Maybe even something that could ship with gitoxide. See [this go implementation as example](https://github.com/filhodanuvem/gitql). -* [ ] A truly awesome history rewriter which makes it easy to understand what happened while avoiding all pitfalls. Think BFG, but more awesome, if that's possible. -* [ ] `gix-tui` should learn a lot from [fossil-scm] regarding the presentation of data. Maybe [this](https://github.com/Lutetium-Vanadium/requestty/) can be used for prompts. Probably [magit] has a lot to offer, too. - -### Ideas for Spin-Offs - -* [ ] A system to integrate tightly with `gix-lfs` to allow a multi-tier architecture so that assets can be stored in git and are accessible quickly from an intranet location - (for example by accessing the storage read-only over the network) while changes are pushed immediately by the server to other edge locations, like _the cloud_ or backups. Sparse checkouts along with explorer/finder integrations - make it convenient to only work on a small subset of files locally. Clones can contain all configuration somebody would need to work efficiently from their location, - and authentication for the git history as well as LFS resources make the system secure. One could imagine encryption support for untrusted locations in _the cloud_ - even though more research would have to be done to make it truly secure. -* [ ] A [syncthing] like client/server application. This is to demonstrate how lower-level crates can be combined into custom applications that use - only part of git's technology to achieve their very own thing. Watch out for big file support, multi-device cross-syncing, the possibility for - untrusted destinations using full-encryption, case-insensitive and sensitive filesystems, and extended file attributes as well as ignore files. -* An event-based database that uses commit messages to store deltas, while occasionally aggregating the actual state in a tree. 
Of course it's distributed by nature, allowing - people to work offline. - - It's abstracted to completely hide the actual data model behind it, allowing for all kinds of things to be implemented on top. - - Commits probably need a nanosecond component for the timestamp, which can be added via custom header field. - - having recording all changes allows for perfect merging, both on the client or on the server, while keeping a natural audit log which makes it useful for mission critical - databases in business. - * **Applications** - - Can markdown be used as database so issue-trackers along with meta-data could just be markdown files which are mostly human-editable? Could user interfaces - be meta-data aware and just hide the meta-data chunks which are now editable in the GUI itself? Doing this would make conflicts easier to resolve than an `sqlite` - database. - - A time tracker - simple data, very likely naturally conflict free, and interesting to see it in terms of teams or companies using it with maybe GitHub as Backing for authentication. - - How about supporting multiple different trackers, as in different remotes? - -[syncthing]: https://github.com/syncthing/syncthing -[fossil-scm]: https://www.fossil-scm.org -[magit]: https://magit.vc - -## Shortcomings & Limitations - -Please take a look at the [`SHORTCOMINGS.md` file](https://github.com/GitoxideLabs/gitoxide/blob/main/SHORTCOMINGS.md) for details. - -## Credits - -* **itertools** _(MIT Licensed)_ - * We use the `izip!` macro in code -* **flate2** _(MIT Licensed)_ - * We use the high-level `flate2` library to implement decompression and compression, which builds on the high-performance `zlib-rs` crate. +Signed-off-by: Dan +Lamad: teaches two-factor-auth pattern; advances auth learning path +Shefa: human contributor | effort=medium | stewards=dan,sofia +Qahal: steward | mechanism=self-review | ref=refs/heads/dev +``` -## 🙏 Special Thanks 🙏 +Stock git reads this commit just fine. GitHub renders it. 
`git log` prints it. Nothing breaks. But a brit-aware tool (or an LLM agent with a brit skill) knows that this commit teaches something (`Lamad`), that Dan and Sofia steward the value it creates (`Shefa`), and that it was self-reviewed for merge to `dev` (`Qahal`). -At least for now this section is exclusive to highlight the incredible support that [Josh Triplett](https://github.com/joshtriplett) has provided to me -in the form of advice, sponsorship and countless other benefits that were incredibly meaningful. Going full time with `gitoxide` would hardly have been -feasible without his involvement, and I couldn't be more grateful 😌. +### Backward-compatible with every git host -## License +A brit repo is a git repo. `git clone https://github.com/your-org/your-brit-repo` works from any machine with stock git. Outside the Elohim Protocol network, you get the full commit history with the trailer lines — readable, diffable, `git log --format=fuller` compatible. You lose the EPR resolution (linked ContentNodes, rich provenance graph, deployment-aware links) because those live on the protocol network, but nothing is broken. The code works. The trailers are there. The provenance is readable. + +Inside the network, a file called `.brit/doorway.toml` in the repo points at the primary steward's doorway node. That doorway resolves the full EPR view — linked ContentNodes for each commit, per-branch README ContentNodes, attestation graphs, economic event streams, and context-aware link resolution. 
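To make the "stock git sees plain trailers, a brit-aware reader sees pillars" distinction concrete, here is a minimal, self-contained sketch of trailer recovery. This is illustrative only: it is not the `brit-epr` engine or its `parse_trailer_block` API, just the standard git convention (trailers are the `Key: value` lines in the last paragraph of the commit message) applied to the example commit above.

```rust
// Illustrative sketch (not the brit-epr API): how a brit-aware tool can
// recover pillar trailers from an ordinary git commit message. Stock git
// treats these as opaque trailer lines; a schema-aware reader dispatches
// on the known pillar keys.
fn parse_trailers(message: &str) -> Vec<(String, String)> {
    // git trailers live in the last paragraph of the message.
    let last_block = message.trim_end().rsplit("\n\n").next().unwrap_or("");
    last_block
        .lines()
        .filter_map(|line| {
            let (key, value) = line.split_once(": ")?;
            Some((key.to_string(), value.to_string()))
        })
        .collect()
}

fn main() {
    let msg = "auth: add two-factor login\n\n\
               Signed-off-by: Dan\n\
               Lamad: teaches two-factor-auth pattern; advances auth learning path\n\
               Shefa: human contributor | effort=medium | stewards=dan,sofia\n\
               Qahal: steward | mechanism=self-review | ref=refs/heads/dev\n";
    let trailers = parse_trailers(msg);
    // Only the three pillar keys matter to the Elohim Protocol schema;
    // everything else (Signed-off-by, ...) passes through untouched.
    let pillars: Vec<_> = trailers
        .iter()
        .filter(|(k, _)| matches!(k.as_str(), "Lamad" | "Shefa" | "Qahal"))
        .collect();
    assert_eq!(pillars.len(), 3);
    println!("{pillars:?}");
}
```

The same message round-trips through `git log` unchanged, which is the backward-compatibility property described below.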
+ +### Engine and app schema — pluggable by design -This project is licensed under either of +The `brit-epr` crate has two layers: - * Apache License, Version 2.0, ([LICENSE-APACHE](LICENSE-APACHE) or - http://www.apache.org/licenses/LICENSE-2.0) - * MIT license ([LICENSE-MIT](LICENSE-MIT) or - http://opensource.org/licenses/MIT) +- **Engine** (unconditional) — a generic covenant engine that parses trailer blocks, validates them against an `AppSchema` trait, and manages `TrailerSet` types. Knows nothing about Lamad, Shefa, or Qahal specifically. +- **Elohim Protocol schema** (feature-gated, default on) — the first-party implementation of `AppSchema` for the Elohim Protocol's three pillars. -at your option. +A downstream project could disable the `elohim-protocol` feature and plug in a different schema — a carbon-accounting protocol, a biological-sequence protocol, a music-composition protocol — without forking brit. The engine is the covenant substrate; the schema is the vocabulary. -## Fun facts +## Current status + +**Phase 1 complete** (trailer foundation): + +- `brit-epr` crate with engine/elohim feature split +- `AppSchema` trait — the dispatch contract for app schemas +- `TrailerSet` type and `parse_trailer_block` via gitoxide's `gix-object` +- `ElohimProtocolSchema` implementing `AppSchema` with closed Lamad/Shefa/Qahal vocabulary +- `parse_pillar_trailers` and `validate_pillar_trailers` convenience functions +- `brit-verify` binary — verifies pillar trailers on a commit, exits 0/1 +- 9 tests passing; engine compiles cleanly with `--no-default-features` + +**Phases 2-7** (planned, not yet implemented): ContentNode adapter, libp2p transport, per-branch READMEs, DHT peer discovery, merge-as-reach-elevation with async consent, fork-as-governance. See [docs/plans/README.md](docs/plans/README.md) for the roadmap.
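The engine/schema split above can be sketched in a few lines. This is a hypothetical illustration of the boundary, not the real `brit-epr` trait: the method names, signatures, and validation rules here are invented for the example; only the division of labor (generic engine, pluggable vocabulary) comes from the design described above.

```rust
// Hypothetical sketch of the engine/app-schema boundary. Names and
// signatures are illustrative, not the real `brit-epr` API. The engine
// only knows "trailers in, verdict out"; the schema supplies the vocabulary.
trait AppSchema {
    /// Keys this schema claims (e.g. the three pillars).
    fn keys(&self) -> &[&str];
    /// Validate one trailer; `Err` carries a human-readable reason.
    fn validate(&self, key: &str, value: &str) -> Result<(), String>;
}

struct ElohimSchema;

impl AppSchema for ElohimSchema {
    fn keys(&self) -> &[&str] {
        &["Lamad", "Shefa", "Qahal"]
    }
    fn validate(&self, key: &str, value: &str) -> Result<(), String> {
        if !self.keys().contains(&key) {
            return Err(format!("unknown pillar: {key}"));
        }
        if value.trim().is_empty() {
            return Err(format!("empty value for pillar: {key}"));
        }
        Ok(())
    }
}

// Engine side: generic over any schema, knows nothing about pillars.
fn check_all(schema: &dyn AppSchema, trailers: &[(&str, &str)]) -> Result<(), String> {
    for &(k, v) in trailers {
        // Non-schema trailers (Signed-off-by, ...) are ignored here;
        // a real engine would also enforce required-key presence.
        if schema.keys().contains(&k) {
            schema.validate(k, v)?;
        }
    }
    Ok(())
}

fn main() {
    let schema = ElohimSchema;
    let ok = check_all(&schema, &[("Lamad", "teaches X"), ("Signed-off-by", "Dan")]);
    assert!(ok.is_ok());
    let bad = check_all(&schema, &[("Qahal", "  ")]);
    assert!(bad.is_err());
}
```

Swapping in a carbon-accounting or music-composition schema means writing one `impl AppSchema` and disabling the default feature; the engine code does not change.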
+ +## Quick start + +```bash +# Build +cargo build -p brit-verify + +# Verify a commit's pillar trailers +cargo run -p brit-verify -- HEAD + +# Expected (on a brit-aware commit): +# ✓ pillar trailers valid for abc1234 +# Lamad: teaches two-factor-auth pattern +# Shefa: human contributor | effort=medium | stewards=dan,sofia +# Qahal: steward | mechanism=self-review | ref=refs/heads/dev + +# Expected (on a stock gitoxide commit): +# ✗ pillar validation failed for abc1234: required pillar trailer missing: Lamad +``` + +## Relationship to gitoxide + +Brit is a fork of [gitoxide](https://github.com/GitoxideLabs/gitoxide) by Sebastian Thiel and contributors. Gitoxide is an excellent pure-Rust git implementation with a clean modular design — each concern lives in its own `gix-*` crate and swaps independently. Brit builds on that modularity. + +**What brit adds:** new crates (`brit-epr`, `brit-verify`, and future `brit-cli`, `brit-transport`, `brit-store`) that layer protocol semantics onto gitoxide's object model. Zero modifications to existing `gix-*` crates. The goal is to remain upstream-rebaseable: bug fixes and additive extension points are proposed upstream where possible; protocol-specific divergence earns its own crate. + +**What brit does not change:** gitoxide's core — object storage, pack format, protocol negotiation, ref management, diff, blame, worktree. Brit consumes these; it doesn't rewrite them. 
+ +## Further reading + +- **[EPR-git roadmap](docs/plans/README.md)** — seven-phase plan from trailer foundation through fork-as-governance +- **[App-level schema design](docs/schemas/elohim-protocol-manifest.md)** — the normative reference for ContentNode types, trailer grammar, signal catalog, and the engine/app-schema boundary +- **[Merge consent critique](docs/schemas/reviews/2026-04-11-merge-consent-critique.md)** — pressure test of async-default merge design against distributed stewardship scenarios +- **[Elohim Protocol](https://github.com/ethosengine/elohim)** — the parent protocol repository +- **[gitoxide](https://github.com/GitoxideLabs/gitoxide)** — the upstream Rust git implementation brit is built on + +## License -* Originally @Byron was really fascinated by [this problem](https://github.com/gitpython-developers/GitPython/issues/765#issuecomment-396072153) - and believes that with `gitoxide` it will be possible to provide the fastest solution for it. -* @Byron has been absolutely blown away by `git` from the first time he experienced git more than 13 years ago, and - tried to implement it in [various shapes](https://github.com/gitpython-developers/GitPython/pull/1028) and [forms](https://github.com/byron/gogit) - multiple [times](https://github.com/Byron/gitplusplus). Now with Rust @Byron finally feels to have found the right tool for the job! +MIT OR Apache-2.0, following gitoxide's dual license. 
diff --git a/brit-build-ref/Cargo.toml b/brit-build-ref/Cargo.toml new file mode 100644 index 00000000000..803659ac023 --- /dev/null +++ b/brit-build-ref/Cargo.toml @@ -0,0 +1,22 @@ +lints.workspace = true + +[package] +name = "brit-build-ref" +version = "0.0.0" +description = "Manage build, deploy, and validation attestation refs in a brit repo" +repository = "https://github.com/ethosengine/brit" +authors = ["Matthew Dowell "] +license = "MIT OR Apache-2.0" +edition = "2021" +rust-version = "1.82" + +[[bin]] +name = "brit-build-ref" +path = "src/main.rs" + +[dependencies] +brit-epr = { version = "^0.0.0", path = "../brit-epr" } +clap = { version = "4", features = ["derive"] } +serde_json = "1" +chrono = { version = "0.4", default-features = false, features = ["clock"] } +anyhow = "1" diff --git a/brit-build-ref/src/build_cmd.rs b/brit-build-ref/src/build_cmd.rs new file mode 100644 index 00000000000..03018c639dc --- /dev/null +++ b/brit-build-ref/src/build_cmd.rs @@ -0,0 +1,82 @@ +//! `build` subcommand — put/get/list build attestation refs. 
+ +use std::path::Path; + +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::object_store::LocalObjectStore; +use brit_epr::engine::signing::AgentKey; +use brit_epr::elohim::attestation::build::BuildAttestationContentNode; +use brit_epr::elohim::refs::BritRefManager; + +#[allow(clippy::too_many_arguments)] +pub fn put( + repo: &Path, + step: &str, + manifest_cid: &str, + output_cid: &str, + inputs_hash: &str, + success: bool, + hardware: &str, + duration_ms: u64, + commit: &str, +) -> anyhow::Result<()> { + let git_dir = repo.join(".git"); + let key_path = git_dir.join("brit").join("agent-key"); + let agent_key = AgentKey::load_or_generate(&key_path)?; + let store = LocalObjectStore::for_git_dir(&git_dir); + let refs = BritRefManager::new(repo)?; + + let hardware_profile: serde_json::Value = serde_json::from_str(hardware) + .map_err(|e| anyhow::anyhow!("invalid --hardware JSON: {e}"))?; + + let manifest_cid: brit_epr::engine::cid::BritCid = manifest_cid + .parse() + .map_err(|e| anyhow::anyhow!("invalid --manifest CID: {e:?}"))?; + let output_cid: brit_epr::engine::cid::BritCid = output_cid + .parse() + .map_err(|e| anyhow::anyhow!("invalid --output CID: {e:?}"))?; + + let built_at = chrono::Utc::now().to_rfc3339(); + + // Build with empty signature, compute canonical JSON, sign, then set signature. 
+ let mut node = BuildAttestationContentNode { + manifest_cid, + step_name: step.to_string(), + inputs_hash: inputs_hash.to_string(), + output_cid, + agent_id: agent_key.agent_id(), + hardware_profile, + build_duration_ms: duration_ms, + built_at, + success, + signature: String::new(), + }; + + let canonical = node.canonical_json()?; + node.signature = agent_key.sign(&canonical); + + let cid = store.put(&node)?; + + let payload = serde_json::to_value(&node)?; + refs.put_build_ref(step, commit, &payload)?; + + println!("{cid}"); + Ok(()) +} + +pub fn get(repo: &Path, step: &str, commit: &str) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + match refs.get_build_ref(step, commit)? { + Some(v) => println!("{}", serde_json::to_string_pretty(&v)?), + None => eprintln!("no build ref found for step={step} commit={commit}"), + } + Ok(()) +} + +pub fn list(repo: &Path) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + for name in refs.list_build_refs(None)? { + println!("{name}"); + } + Ok(()) +} diff --git a/brit-build-ref/src/deploy_cmd.rs b/brit-build-ref/src/deploy_cmd.rs new file mode 100644 index 00000000000..84f2d67c24d --- /dev/null +++ b/brit-build-ref/src/deploy_cmd.rs @@ -0,0 +1,76 @@ +//! `deploy` subcommand — put/get/list deploy attestation refs. 
+ +use std::path::Path; + +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::object_store::LocalObjectStore; +use brit_epr::engine::signing::AgentKey; +use brit_epr::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus}; +use brit_epr::elohim::refs::BritRefManager; + +#[allow(clippy::too_many_arguments)] +pub fn put( + repo: &Path, + step: &str, + env: &str, + artifact_cid: &str, + endpoint: &str, + health_check_epr: &str, + health: HealthStatus, + ttl: u64, +) -> anyhow::Result<()> { + let git_dir = repo.join(".git"); + let key_path = git_dir.join("brit").join("agent-key"); + let agent_key = AgentKey::load_or_generate(&key_path)?; + let store = LocalObjectStore::for_git_dir(&git_dir); + let refs = BritRefManager::new(repo)?; + + let artifact_cid: brit_epr::engine::cid::BritCid = artifact_cid + .parse() + .map_err(|e| anyhow::anyhow!("invalid --artifact CID: {e:?}"))?; + + let now = chrono::Utc::now().to_rfc3339(); + + let mut node = DeployAttestationContentNode { + artifact_cid, + step_name: step.to_string(), + environment_label: env.to_string(), + endpoint: endpoint.to_string(), + health_check_epr: health_check_epr.to_string(), + health_status: health, + deployed_at: now.clone(), + attested_at: now, + liveness_ttl_sec: ttl, + agent_id: agent_key.agent_id(), + signature: String::new(), + }; + + let canonical = node.canonical_json()?; + node.signature = agent_key.sign(&canonical); + + store.put(&node)?; + + let payload = serde_json::to_value(&node)?; + refs.put_deploy_ref(step, env, &payload)?; + + let cid = node.compute_cid()?; + println!("{cid}"); + Ok(()) +} + +pub fn get(repo: &Path, step: &str, env: &str) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + match refs.get_deploy_ref(step, env)? 
{ + Some(v) => println!("{}", serde_json::to_string_pretty(&v)?), + None => eprintln!("no deploy ref found for step={step} env={env}"), + } + Ok(()) +} + +pub fn list(repo: &Path) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + for name in refs.list_deploy_refs(None)? { + println!("{name}"); + } + Ok(()) +} diff --git a/brit-build-ref/src/main.rs b/brit-build-ref/src/main.rs new file mode 100644 index 00000000000..63685ee3849 --- /dev/null +++ b/brit-build-ref/src/main.rs @@ -0,0 +1,240 @@ +//! `brit-build-ref` — CLI for build/deploy/validate/reach attestation refs. + +use std::path::PathBuf; + +use clap::{Parser, Subcommand}; + +mod build_cmd; +mod deploy_cmd; +mod reach_cmd; +mod validate_cmd; + +#[derive(Parser)] +#[command(name = "brit-build-ref", about = "Manage build/deploy/validate/reach attestation refs")] +struct Cli { + /// Path to the git repository (default: current directory). + #[arg(long, default_value = ".")] + repo: PathBuf, + + #[command(subcommand)] + command: TopCommand, +} + +#[derive(Subcommand)] +enum TopCommand { + /// Build attestation refs. + Build { + #[command(subcommand)] + cmd: BuildCmd, + }, + /// Deploy attestation refs. + Deploy { + #[command(subcommand)] + cmd: DeployCmd, + }, + /// Validation attestation refs. + Validate { + #[command(subcommand)] + cmd: ValidateCmd, + }, + /// Reach level computation. + Reach { + #[command(subcommand)] + cmd: ReachCmd, + }, +} + +// ─── Build subcommands ──────────────────────────────────────────────────────── + +#[derive(Subcommand)] +enum BuildCmd { + /// Record a build attestation. + Put { + /// Pipeline step name. + #[arg(long)] + step: String, + /// CID of the build manifest. + #[arg(long)] + manifest: String, + /// CID of the artifact produced. + #[arg(long)] + output: String, + /// Content hash of all declared inputs at build time. + #[arg(long)] + inputs_hash: String, + /// Whether the build succeeded. 
+ #[arg(long, default_value_t = true)] + success: bool, + /// Hardware profile JSON. + #[arg(long, default_value = "{}")] + hardware: String, + /// Build duration in milliseconds. + #[arg(long, default_value_t = 0)] + duration_ms: u64, + /// Git commit revision. + #[arg(long, default_value = "HEAD")] + commit: String, + }, + /// Retrieve a build attestation. + Get { + /// Pipeline step name. + #[arg(long)] + step: String, + /// Git commit revision. + #[arg(long, default_value = "HEAD")] + commit: String, + }, + /// List all build attestation step names. + List, +} + +// ─── Deploy subcommands ─────────────────────────────────────────────────────── + +#[derive(Subcommand)] +enum DeployCmd { + /// Record a deploy attestation. + Put { + /// Pipeline step name. + #[arg(long)] + step: String, + /// Deployment environment label. + #[arg(long)] + env: String, + /// CID of the artifact deployed. + #[arg(long)] + artifact: String, + /// Base URL of the deployed service. + #[arg(long)] + endpoint: String, + /// EPR reference for the liveness health check. + #[arg(long)] + health_check_epr: String, + /// Health status (healthy|degraded|unreachable). + #[arg(long, default_value = "healthy")] + health: String, + /// Liveness TTL in seconds. + #[arg(long, default_value_t = 300)] + ttl: u64, + }, + /// Retrieve a deploy attestation. + Get { + /// Pipeline step name. + #[arg(long)] + step: String, + /// Deployment environment label. + #[arg(long)] + env: String, + }, + /// List all deploy attestation names. + List, +} + +// ─── Validate subcommands ───────────────────────────────────────────────────── + +#[derive(Subcommand)] +enum ValidateCmd { + /// Record a validation attestation. + Put { + /// Pipeline step name. + #[arg(long)] + step: String, + /// Check name (e.g. lint@v1). + #[arg(long)] + check: String, + /// CID of the artifact validated. + #[arg(long)] + artifact: String, + /// Validation result (pass|fail|warn|skip). 
+ #[arg(long)] + result: String, + /// Human-readable result summary. + #[arg(long, default_value = "")] + summary: String, + /// Version of the validator tool. + #[arg(long, default_value = "")] + validator_version: String, + }, + /// Retrieve a validation attestation. + Get { + /// Pipeline step name. + #[arg(long)] + step: String, + /// Check name. + #[arg(long)] + check: String, + }, + /// List all validation attestation names. + List, +} + +// ─── Reach subcommands ──────────────────────────────────────────────────────── + +#[derive(Subcommand)] +enum ReachCmd { + /// Compute and store reach level for a step. + Compute { + /// Pipeline step name. + #[arg(long)] + step: String, + }, + /// Retrieve the stored reach level. + Get { + /// Pipeline step name. + #[arg(long)] + step: String, + }, +} + +// ─── Entry point ────────────────────────────────────────────────────────────── + +fn main() -> anyhow::Result<()> { + let cli = Cli::parse(); + let repo = cli.repo.canonicalize() + .unwrap_or_else(|_| cli.repo.clone()); + + match cli.command { + TopCommand::Build { cmd } => match cmd { + BuildCmd::Put { step, manifest, output, inputs_hash, success, hardware, duration_ms, commit } => { + build_cmd::put(&repo, &step, &manifest, &output, &inputs_hash, success, &hardware, duration_ms, &commit) + } + BuildCmd::Get { step, commit } => build_cmd::get(&repo, &step, &commit), + BuildCmd::List => build_cmd::list(&repo), + }, + + TopCommand::Deploy { cmd } => match cmd { + DeployCmd::Put { step, env, artifact, endpoint, health_check_epr, health, ttl } => { + use brit_epr::elohim::attestation::deploy::HealthStatus; + let health_status = match health.as_str() { + "healthy" => HealthStatus::Healthy, + "degraded" => HealthStatus::Degraded, + "unreachable" => HealthStatus::Unreachable, + other => anyhow::bail!("unknown --health value: {other} (expected healthy|degraded|unreachable)"), + }; + deploy_cmd::put(&repo, &step, &env, &artifact, &endpoint, &health_check_epr, health_status, 
ttl) + } + DeployCmd::Get { step, env } => deploy_cmd::get(&repo, &step, &env), + DeployCmd::List => deploy_cmd::list(&repo), + }, + + TopCommand::Validate { cmd } => match cmd { + ValidateCmd::Put { step, check, artifact, result, summary, validator_version } => { + use brit_epr::elohim::attestation::validation::ValidationResult; + let vr = match result.as_str() { + "pass" => ValidationResult::Pass, + "fail" => ValidationResult::Fail, + "warn" => ValidationResult::Warn, + "skip" => ValidationResult::Skip, + other => anyhow::bail!("unknown --result value: {other} (expected pass|fail|warn|skip)"), + }; + validate_cmd::put(&repo, &step, &check, &artifact, vr, &summary, &validator_version) + } + ValidateCmd::Get { step, check } => validate_cmd::get(&repo, &step, &check), + ValidateCmd::List => validate_cmd::list(&repo), + }, + + TopCommand::Reach { cmd } => match cmd { + ReachCmd::Compute { step } => reach_cmd::compute(&repo, &step), + ReachCmd::Get { step } => reach_cmd::get(&repo, &step), + }, + } +} diff --git a/brit-build-ref/src/reach_cmd.rs b/brit-build-ref/src/reach_cmd.rs new file mode 100644 index 00000000000..edce79bf2fa --- /dev/null +++ b/brit-build-ref/src/reach_cmd.rs @@ -0,0 +1,58 @@ +//! `reach` subcommand — compute/get reach level for a pipeline step. + +use std::path::Path; + +use brit_epr::elohim::attestation::reach::{compute_reach, ReachInput}; + +use brit_epr::elohim::refs::BritRefManager; + +pub fn compute(repo: &Path, step: &str) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + + // Collect build attestations (step names from build refs). + let build_attestations = refs.list_build_refs(None)? + .into_iter() + .filter(|name| name == step || name.starts_with(&format!("{step}/"))) + .collect::<Vec<String>>(); + + // Collect deploy attestations (env labels from deploy refs for this step). + let deploy_attestations = refs.list_deploy_refs(None)?
+ .into_iter() + .filter(|name| name.starts_with(&format!("{step}/"))) + .map(|name| name.trim_start_matches(&format!("{step}/")).to_string()) + .collect::<Vec<String>>(); + + // Collect validation attestations (check names from validate refs for this step). + let validation_attestations = refs.list_validate_refs(None)? + .into_iter() + .filter(|name| name.starts_with(&format!("{step}/"))) + .map(|name| name.trim_start_matches(&format!("{step}/")).to_string()) + .collect::<Vec<String>>(); + + let input = ReachInput { + build_attestations, + deploy_attestations, + validation_attestations, + }; + + let level = compute_reach(&input); + + let payload = serde_json::json!({ + "step": step, + "reach": level, + "computed_at": chrono::Utc::now().to_rfc3339(), + }); + + refs.put_reach_ref(step, &payload)?; + + println!("{}", serde_json::to_string_pretty(&payload)?); + Ok(()) +} + +pub fn get(repo: &Path, step: &str) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + match refs.get_reach_ref(step)? { + Some(v) => println!("{}", serde_json::to_string_pretty(&v)?), + None => eprintln!("no reach ref found for step={step}"), + } + Ok(()) +} diff --git a/brit-build-ref/src/validate_cmd.rs b/brit-build-ref/src/validate_cmd.rs new file mode 100644 index 00000000000..522fa6c468e --- /dev/null +++ b/brit-build-ref/src/validate_cmd.rs @@ -0,0 +1,73 @@ +//! `validate` subcommand — put/get/list validation attestation refs.
+ +use std::path::Path; + +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::object_store::LocalObjectStore; +use brit_epr::engine::signing::AgentKey; +use brit_epr::elohim::attestation::validation::{ValidationAttestationContentNode, ValidationResult}; +use brit_epr::elohim::refs::BritRefManager; + +pub fn put( + repo: &Path, + step: &str, + check: &str, + artifact_cid: &str, + result: ValidationResult, + summary: &str, + validator_version: &str, +) -> anyhow::Result<()> { + let git_dir = repo.join(".git"); + let key_path = git_dir.join("brit").join("agent-key"); + let agent_key = AgentKey::load_or_generate(&key_path)?; + let store = LocalObjectStore::for_git_dir(&git_dir); + let refs = BritRefManager::new(repo)?; + + let artifact_cid: brit_epr::engine::cid::BritCid = artifact_cid + .parse() + .map_err(|e| anyhow::anyhow!("invalid --artifact CID: {e:?}"))?; + + let validated_at = chrono::Utc::now().to_rfc3339(); + + let mut node = ValidationAttestationContentNode { + artifact_cid, + check_name: check.to_string(), + validator_id: agent_key.agent_id(), + validator_version: if validator_version.is_empty() { "unknown".to_string() } else { validator_version.to_string() }, + result, + result_summary: summary.to_string(), + findings_cid: None, + validated_at, + ttl_sec: None, + signature: String::new(), + }; + + let canonical = node.canonical_json()?; + node.signature = agent_key.sign(&canonical); + + store.put(&node)?; + + let payload = serde_json::to_value(&node)?; + refs.put_validate_ref(step, check, &payload)?; + + let cid = node.compute_cid()?; + println!("{cid}"); + Ok(()) +} + +pub fn get(repo: &Path, step: &str, check: &str) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + match refs.get_validate_ref(step, check)? 
{ + Some(v) => println!("{}", serde_json::to_string_pretty(&v)?), + None => eprintln!("no validate ref found for step={step} check={check}"), + } + Ok(()) +} + +pub fn list(repo: &Path) -> anyhow::Result<()> { + let refs = BritRefManager::new(repo)?; + for name in refs.list_validate_refs(None)? { + println!("{name}"); + } + Ok(()) +} diff --git a/brit-cli/Cargo.toml b/brit-cli/Cargo.toml new file mode 100644 index 00000000000..ecf182e0a4d --- /dev/null +++ b/brit-cli/Cargo.toml @@ -0,0 +1,29 @@ +lints.workspace = true + +[package] +name = "brit-cli" +version = "0.0.0" +description = "Rakia operator CLI — graph, affected, plan, fingerprint, baseline subcommands (will eventually fold into the unified brit binary)" +repository = "https://github.com/ethosengine/brit" +authors = ["Matthew Dowell "] +license = "MIT OR Apache-2.0" +edition = "2021" +rust-version = "1.82" + +[[bin]] +name = "rakia" +path = "src/main.rs" + +[dependencies] +brit-epr = { path = "../brit-epr", default-features = false } +brit-graph = { path = "../brit-graph", features = ["repo"] } +rakia-core = { path = "../../rakia/rakia-core" } +rakia-brit = { path = "../../rakia/rakia-brit" } + +clap = { version = "4", features = ["derive"] } +gix = { version = "0.81", default-features = false, features = ["basic", "revision", "blob-diff", "sha1"] } +petgraph = "0.7" +serde = { version = "1", features = ["derive"] } +serde_json = "1" +thiserror = "2" +anyhow = "1" diff --git a/brit-cli/src/commands/affected.rs b/brit-cli/src/commands/affected.rs new file mode 100644 index 00000000000..17d5cbb6296 --- /dev/null +++ b/brit-cli/src/commands/affected.rs @@ -0,0 +1,61 @@ +//! brit affected — which steps are affected by changes, with provenance. 
+ +use std::path::Path; + +use serde::Serialize; + +use crate::error::{CliError, Result}; + +#[derive(Serialize)] +struct AffectedOutput { + changed_paths: Vec<String>, + affected: Vec<AffectedStep>, +} + +#[derive(Serialize)] +struct AffectedStep { + qualified_name: String, + affected_by: Vec<String>, +} + +pub fn run(repo: &Path, files: Option<&str>, since: Option<&str>) -> Result<()> { + let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound { + path: repo.display().to_string(), + source, + })?; + + let changed_paths: Vec<String> = if let Some(files) = files { + files + .split(',') + .map(|s| s.trim().to_string()) + .filter(|s| !s.is_empty()) + .collect() + } else if let Some(since) = since { + rakia_brit::changes::changed_paths_since(&repo, since, "HEAD") + .map_err(|e| CliError::ChangeDetection(format!("{e}")))? + } else { + return Err(CliError::Args("need --files or --since".into())); + }; + + let manifests = rakia_core::discover::discover_manifests(&repo) + .map_err(|e| CliError::ManifestDiscovery(format!("{e}")))?; + let constellation = rakia_core::constellation::build_constellation(&manifests)?; + let plan = rakia_core::constellation::plan_from_changes(&constellation, &changed_paths)?; + + // Flatten plan levels into a single affected list — `affected` doesn't care about ordering + let mut affected: Vec<AffectedStep> = Vec::new(); + for level in &plan.levels { + for (step, reasons) in level { + affected.push(AffectedStep { + qualified_name: step.qualified_name.clone(), + affected_by: reasons.clone(), + }); + } + } + + crate::output::print_json(&AffectedOutput { + changed_paths, + affected, + })?; + Ok(()) +} diff --git a/brit-cli/src/commands/baseline.rs b/brit-cli/src/commands/baseline.rs new file mode 100644 index 00000000000..5f67b423b13 --- /dev/null +++ b/brit-cli/src/commands/baseline.rs @@ -0,0 +1,76 @@ +//! brit baseline — read, write, and migrate baseline refs.
+
+use std::path::Path;
+
+use serde::Serialize;
+
+use crate::error::{CliError, Result};
+
+#[derive(Serialize)]
+struct BaselineRead {
+    pipeline: String,
+    r#ref: String,
+    commit: Option<String>,
+}
+
+#[derive(Serialize)]
+struct BaselineWrite {
+    pipeline: String,
+    r#ref: String,
+    commit: String,
+    written: bool,
+}
+
+#[derive(Serialize)]
+struct BaselineMigrate {
+    source: String,
+    migrated: Vec<String>,
+    count: usize,
+}
+
+pub fn read(repo: &Path, pipeline: &str) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+    let commit = rakia_brit::baselines::read_baseline(&repo, pipeline)
+        .map_err(|e| CliError::Baseline(format!("{e}")))?;
+    crate::output::print_json(&BaselineRead {
+        pipeline: pipeline.to_string(),
+        r#ref: format!("refs/notes/rakia/baselines/{pipeline}"),
+        commit,
+    })?;
+    Ok(())
+}
+
+pub fn write(repo: &Path, pipeline: &str, commit: &str) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+    rakia_brit::baselines::write_baseline(&repo, pipeline, commit)
+        .map_err(|e| CliError::Baseline(format!("{e}")))?;
+    crate::output::print_json(&BaselineWrite {
+        pipeline: pipeline.to_string(),
+        r#ref: format!("refs/notes/rakia/baselines/{pipeline}"),
+        commit: commit.to_string(),
+        written: true,
+    })?;
+    Ok(())
+}
+
+pub fn migrate(repo: &Path, json_path: &Path) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+    let migrated = rakia_brit::baselines::migrate_baselines(&repo, json_path)
+        .map_err(|e| CliError::Baseline(format!("{e}")))?;
+    let count = migrated.len();
+    crate::output::print_json(&BaselineMigrate {
+        source: json_path.display().to_string(),
+        migrated,
+        count,
+    })?;
+    Ok(())
+}
diff --git a/brit-cli/src/commands/fingerprint.rs b/brit-cli/src/commands/fingerprint.rs
new
file mode 100644
index 00000000000..ded2da55801
--- /dev/null
+++ b/brit-cli/src/commands/fingerprint.rs
@@ -0,0 +1,77 @@
+//! brit fingerprint — content-addressed hash of step inputs.
+//!
+//! Resolves each step's source + buildProcess globs against the git tree at
+//! the given commit (default HEAD), reads matching blob contents, and computes
+//! a deterministic ContentFingerprint per step.
+
+use std::path::Path;
+
+use serde::Serialize;
+
+use crate::error::{CliError, Result};
+
+#[derive(Serialize)]
+struct FingerprintOutput {
+    manifest: String,
+    commit: String,
+    fingerprints: Vec<StepFingerprint>,
+}
+
+#[derive(Serialize)]
+struct StepFingerprint {
+    pipeline: String,
+    step: String,
+    fingerprint: String,
+    input_count: usize,
+}
+
+pub fn run(manifest_path: &Path, step_filter: Option<&str>, commit_ref: &str) -> Result<()> {
+    let text = std::fs::read_to_string(manifest_path)?;
+    let m: rakia_core::manifest::BuildManifest = serde_json::from_str(&text)?;
+
+    // Discover the repo from the manifest's parent dir
+    let manifest_dir = manifest_path
+        .parent()
+        .ok_or_else(|| CliError::Args(format!("manifest has no parent dir: {}", manifest_path.display())))?;
+    let repo = gix::discover(manifest_dir).map_err(|e| {
+        CliError::Args(format!("repo discovery failed for {}: {e}", manifest_dir.display()))
+    })?;
+
+    // Resolve the commit ref to an ObjectId
+    let commit_id = repo
+        .rev_parse_single(commit_ref)
+        .map_err(|e| CliError::Args(format!("could not resolve commit '{commit_ref}': {e}")))?
+        .detach();
+
+    let mut out = Vec::new();
+    for (name, step) in &m.steps {
+        if let Some(filter) = step_filter {
+            if name != filter {
+                continue;
+            }
+        }
+        let mut all_patterns: Vec<String> = step.inputs.sources.clone();
+        all_patterns.extend(step.inputs.build_process.iter().cloned());
+
+        let fp = brit_graph::fingerprint::ContentFingerprint::from_repo_globs(
+            &repo,
+            commit_id,
+            &all_patterns,
+        )
+        .map_err(|e| CliError::Args(format!("fingerprint compute failed for step '{name}': {e}")))?;
+
+        out.push(StepFingerprint {
+            pipeline: m.pipeline.clone(),
+            step: name.clone(),
+            fingerprint: fp.cid.as_str().to_string(),
+            input_count: fp.inputs.len(),
+        });
+    }
+
+    crate::output::print_json(&FingerprintOutput {
+        manifest: manifest_path.display().to_string(),
+        commit: commit_id.to_hex().to_string(),
+        fingerprints: out,
+    })?;
+    Ok(())
+}
diff --git a/brit-cli/src/commands/graph_discover.rs b/brit-cli/src/commands/graph_discover.rs
new file mode 100644
index 00000000000..1c4e90230df
--- /dev/null
+++ b/brit-cli/src/commands/graph_discover.rs
@@ -0,0 +1,53 @@
+//! brit graph discover — list all build-manifest.json files + summary JSON.
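The discover listing reports manifest paths repo-relative where possible, falling back to the absolute path for anything outside the repo. That `strip_prefix`-with-fallback idiom can be sketched with `std` alone (the repo and manifest paths here are hypothetical):

```rust
use std::path::Path;

// Report a path relative to the repo root when it is inside the repo,
// and fall back to the path as-is when it is not.
fn relativize(repo: &Path, path: &Path) -> String {
    path.strip_prefix(repo).unwrap_or(path).display().to_string()
}

fn main() {
    let repo = Path::new("/work/elohim");
    let inside = Path::new("/work/elohim/app/build-manifest.json");
    let outside = Path::new("/tmp/other/build-manifest.json");
    assert_eq!(relativize(repo, inside), "app/build-manifest.json");
    assert_eq!(relativize(repo, outside), "/tmp/other/build-manifest.json");
}
```

`Path::strip_prefix` returns a `Result`, so `unwrap_or(path)` makes the fallback a one-liner with no panic path.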
+
+use std::path::Path;
+
+use serde::Serialize;
+
+use crate::error::{CliError, Result};
+
+#[derive(Serialize)]
+struct DiscoverOutput {
+    manifests: Vec<ManifestSummary>,
+}
+
+#[derive(Serialize)]
+struct ManifestSummary {
+    path: String,
+    pipeline: String,
+    description: String,
+    step_count: usize,
+    steps: Vec<String>,
+}
+
+pub fn run(repo: &Path) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+
+    let manifests = rakia_core::discover::discover_manifests(&repo)
+        .map_err(|e| CliError::ManifestDiscovery(format!("{e}")))?;
+
+    let summaries: Vec<ManifestSummary> = manifests
+        .into_iter()
+        .map(|(path, m)| {
+            let mut steps: Vec<String> = m.steps.keys().cloned().collect();
+            steps.sort();
+            ManifestSummary {
+                path: path
+                    .strip_prefix(&repo)
+                    .unwrap_or(&path)
+                    .display()
+                    .to_string(),
+                pipeline: m.pipeline,
+                description: m.description,
+                step_count: steps.len(),
+                steps,
+            }
+        })
+        .collect();
+
+    crate::output::print_json(&DiscoverOutput { manifests: summaries })?;
+    Ok(())
+}
diff --git a/brit-cli/src/commands/graph_show.rs b/brit-cli/src/commands/graph_show.rs
new file mode 100644
index 00000000000..0a82edb8be9
--- /dev/null
+++ b/brit-cli/src/commands/graph_show.rs
@@ -0,0 +1,90 @@
+//! brit graph show — emit the build constellation as JSON or Graphviz DOT.
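`graph show` orients edges from dependency to dependent (`dep → step`), so a topological order over the emitted graph is a valid build order. A `std`-only sketch of that orientation, with made-up step names:

```rust
use std::collections::HashMap;

// Build an edge list oriented dependency -> dependent, the same
// orientation graph_show uses for both its JSON and DOT output.
fn edges(depends: &HashMap<&'static str, Vec<&'static str>>) -> Vec<(&'static str, &'static str)> {
    let mut out = Vec::new();
    for (step, deps) in depends {
        for dep in deps {
            out.push((*dep, *step));
        }
    }
    out.sort(); // deterministic output regardless of HashMap iteration order
    out
}

fn main() {
    let mut depends = HashMap::new();
    depends.insert("app:build", vec!["lib:build"]);
    depends.insert("app:test", vec!["app:build"]);
    depends.insert("lib:build", vec![]);

    // lib:build feeds app:build, which feeds app:test.
    assert_eq!(
        edges(&depends),
        vec![("app:build", "app:test"), ("lib:build", "app:build")]
    );
}
```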
+
+use std::path::Path;
+
+use petgraph::dot::{Config, Dot};
+use petgraph::graph::DiGraph;
+use serde::Serialize;
+
+use crate::error::{CliError, Result};
+
+#[derive(Serialize)]
+struct GraphJson {
+    nodes: Vec<NodeJson>,
+    edges: Vec<EdgeJson>,
+}
+
+#[derive(Serialize)]
+struct NodeJson {
+    qualified_name: String,
+    pipeline: String,
+    name: String,
+    sources: Vec<String>,
+    artifacts: Vec<String>,
+}
+
+#[derive(Serialize)]
+struct EdgeJson {
+    from: String,
+    to: String,
+}
+
+pub fn run(repo: &Path, format: &str) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+
+    let manifests = rakia_core::discover::discover_manifests(&repo)
+        .map_err(|e| CliError::ManifestDiscovery(format!("{e}")))?;
+    let constellation = rakia_core::constellation::build_constellation(&manifests)?;
+
+    match format {
+        "json" => {
+            let nodes: Vec<NodeJson> = constellation
+                .steps
+                .values()
+                .map(|s| NodeJson {
+                    qualified_name: s.qualified_name.clone(),
+                    pipeline: s.pipeline.clone(),
+                    name: s.step_name.clone(),
+                    sources: s.source_patterns.clone(),
+                    artifacts: s.artifacts.clone(),
+                })
+                .collect();
+            let mut edges = Vec::new();
+            for s in constellation.steps.values() {
+                for dep in &s.resolved_depends {
+                    edges.push(EdgeJson {
+                        from: dep.clone(),
+                        to: s.qualified_name.clone(),
+                    });
+                }
+            }
+            crate::output::print_json(&GraphJson { nodes, edges })?;
+        }
+        "dot" => {
+            let mut g: DiGraph<String, ()> = DiGraph::new();
+            let mut node_indices = std::collections::HashMap::new();
+            for s in constellation.steps.values() {
+                let idx = g.add_node(s.qualified_name.clone());
+                node_indices.insert(s.qualified_name.clone(), idx);
+            }
+            for s in constellation.steps.values() {
+                let to = node_indices[&s.qualified_name];
+                for dep in &s.resolved_depends {
+                    if let Some(&from) = node_indices.get(dep) {
+                        g.add_edge(from, to, ());
+                    }
+                }
+            }
+            let dot = Dot::with_config(&g, &[Config::EdgeNoLabel]);
+            println!("{dot:?}");
+        }
+        other => {
+            return
Err(CliError::Args(format!("unknown format: {other}")));
+        }
+    }
+
+    Ok(())
+}
diff --git a/brit-cli/src/commands/mod.rs b/brit-cli/src/commands/mod.rs
new file mode 100644
index 00000000000..246f42fabbe
--- /dev/null
+++ b/brit-cli/src/commands/mod.rs
@@ -0,0 +1,8 @@
+//! brit CLI subcommands.
+
+pub mod graph_discover;
+pub mod graph_show;
+pub mod affected;
+pub mod plan;
+pub mod fingerprint;
+pub mod baseline;
diff --git a/brit-cli/src/commands/plan.rs b/brit-cli/src/commands/plan.rs
new file mode 100644
index 00000000000..162fd985d3a
--- /dev/null
+++ b/brit-cli/src/commands/plan.rs
@@ -0,0 +1,114 @@
+//! brit plan — topologically grouped build plan, conforming to build-plan.schema.json.
+
+use std::collections::BTreeMap;
+use std::path::Path;
+
+use crate::error::{CliError, Result};
+
+const TOOL_VERSION: &str = env!("CARGO_PKG_VERSION");
+
+pub fn run(
+    repo: &Path,
+    files: Option<&str>,
+    since: Option<&str>,
+    pipeline: Option<&str>,
+) -> Result<()> {
+    let repo = repo.canonicalize().map_err(|source| CliError::RepoNotFound {
+        path: repo.display().to_string(),
+        source,
+    })?;
+
+    let (changed_paths, baseline_ref, baseline_commit, head_commit) = if let Some(files) = files {
+        let paths: Vec<String> = files
+            .split(',')
+            .map(|s| s.trim().to_string())
+            .filter(|s| !s.is_empty())
+            .collect();
+        // For --files mode, baseline + head are not git-derived; use placeholders
+        // (40 zeros for both — schema accepts them as long as they're 40 hex chars)
+        (paths, "(none)".to_string(), "0".repeat(40), "0".repeat(40))
+    } else if let Some(since) = since {
+        let head_commit_sha = rakia_brit::changes::head_commit(&repo)
+            .map_err(|e| CliError::ChangeDetection(format!("{e}")))?;
+        let baseline_commit_sha = rakia_brit::changes::resolve_ref(&repo, since)
+            .map_err(|e| CliError::ChangeDetection(format!("{e}")))?;
+        let paths = rakia_brit::changes::changed_paths_since(&repo, since, "HEAD")
+            .map_err(|e| CliError::ChangeDetection(format!("{e}")))?;
+        let ref_name = if let
Some(p) = pipeline {
+            format!("refs/notes/rakia/baselines/{p}")
+        } else {
+            since.to_string()
+        };
+        (paths, ref_name, baseline_commit_sha, head_commit_sha)
+    } else {
+        return Err(CliError::Args("need --files or --since".into()));
+    };
+
+    let manifests = rakia_core::discover::discover_manifests(&repo)
+        .map_err(|e| CliError::ManifestDiscovery(format!("{e}")))?;
+    let constellation = rakia_core::constellation::build_constellation(&manifests)?;
+    let plan = rakia_core::constellation::plan_from_changes(&constellation, &changed_paths)?;
+
+    // Compute content-addressed fingerprints for each step in the plan.
+    // Skipped in --files mode (head_commit is placeholder zeros).
+    let fingerprints = compute_fingerprints(&repo, &head_commit, &plan)?;
+
+    let bp = rakia_core::build_plan::to_build_plan(
+        &plan,
+        &baseline_ref,
+        &baseline_commit,
+        &head_commit,
+        &changed_paths,
+        TOOL_VERSION,
+        &fingerprints,
+    );
+
+    crate::output::print_json(&bp)?;
+    Ok(())
+}
+
+/// Compute the ContentFingerprint for each step in the plan, keyed by
+/// qualified_name. Uses the head commit as the tree to read from.
+///
+/// For --files mode (head_commit is `"0" * 40` placeholder), skip
+/// fingerprinting and return an empty map. PlannedStep.fingerprint will
+/// then be `""` (the no-repo-context sentinel).
+fn compute_fingerprints(
+    repo_path: &Path,
+    head_commit_hex: &str,
+    plan: &rakia_core::constellation::TopoPlan,
+) -> Result<BTreeMap<String, String>> {
+    let mut out = BTreeMap::new();
+
+    if head_commit_hex.chars().all(|c| c == '0') {
+        return Ok(out);
+    }
+
+    let repo = gix::discover(repo_path)
+        .map_err(|e| CliError::Args(format!("repo open failed: {e}")))?;
+    let commit_id = gix::ObjectId::from_hex(head_commit_hex.as_bytes())
+        .map_err(|e| CliError::Args(format!("invalid commit hex '{head_commit_hex}': {e}")))?;
+
+    for level in &plan.levels {
+        for (step, _reasons) in level {
+            let mut all_patterns: Vec<String> = step.source_patterns.clone();
+            all_patterns.extend(step.build_process.iter().cloned());
+
+            let fp = brit_graph::fingerprint::ContentFingerprint::from_repo_globs(
+                &repo,
+                commit_id,
+                &all_patterns,
+            )
+            .map_err(|e| {
+                CliError::Args(format!(
+                    "fingerprint failed for step '{}': {e}",
+                    step.qualified_name
                ))
+            })?;
+
+            out.insert(step.qualified_name.clone(), fp.cid.as_str().to_string());
+        }
+    }
+
+    Ok(out)
+}
diff --git a/brit-cli/src/error.rs b/brit-cli/src/error.rs
new file mode 100644
index 00000000000..b0162865c63
--- /dev/null
+++ b/brit-cli/src/error.rs
@@ -0,0 +1,49 @@
+//! Error types and exit code mapping for the brit CLI.
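The exit-code mapping this error module adopts (2 for usage errors, 1 for everything else, 0 reserved for success) can be mirrored in a standalone sketch — a trimmed stand-in enum, not the real `CliError`:

```rust
// Trimmed stand-in for the CLI error type: usage/argument errors
// exit 2, all other failures exit 1 (0 is reserved for success).
enum CliError {
    Args(String),
    Other(String),
}

impl CliError {
    fn exit_code(&self) -> i32 {
        match self {
            CliError::Args(_) => 2,
            _ => 1,
        }
    }
}

fn main() {
    assert_eq!(CliError::Args("need --files or --since".into()).exit_code(), 2);
    assert_eq!(CliError::Other("io error".into()).exit_code(), 1);
}
```

Keeping the mapping in one `match` means new error variants default to exit 1 unless deliberately promoted to a usage error.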
+
+use thiserror::Error;
+
+#[derive(Error, Debug)]
+pub enum CliError {
+    #[error("repo not found at {path}: {source}")]
+    RepoNotFound {
+        path: String,
+        #[source]
+        source: std::io::Error,
+    },
+
+    #[error("manifest discovery failed: {0}")]
+    ManifestDiscovery(String),
+
+    #[error("constellation construction failed: {0}")]
+    Constellation(#[from] rakia_core::constellation::ConstellationError),
+
+    #[error("change detection failed: {0}")]
+    ChangeDetection(String),
+
+    #[error("baseline operation failed: {0}")]
+    Baseline(String),
+
+    #[error("invalid arguments: {0}")]
+    Args(String),
+
+    #[error("io error: {0}")]
+    Io(#[from] std::io::Error),
+
+    #[error("json error: {0}")]
+    Json(#[from] serde_json::Error),
+}
+
+impl CliError {
+    /// Map error variants to exit codes.
+    /// 0 — success (not used here)
+    /// 1 — generic failure
+    /// 2 — argument/usage error
+    pub fn exit_code(&self) -> i32 {
+        match self {
+            CliError::Args(_) => 2,
+            _ => 1,
+        }
+    }
+}
+
+pub type Result<T> = std::result::Result<T, CliError>;
diff --git a/brit-cli/src/main.rs b/brit-cli/src/main.rs
new file mode 100644
index 00000000000..3e6676f072b
--- /dev/null
+++ b/brit-cli/src/main.rs
@@ -0,0 +1,140 @@
+//! rakia CLI — operator surface for the build/CI orchestrator.
+//!
+//! Composes brit (git/EPR primitives) + REA (economic primitives) into
+//! build-domain semantics. The `brit` binary itself is the daily-driver
+//! git client (gitoxide-derived); this `rakia` binary is the build app
+//! that consumes brit's primitives.
+
+use std::path::PathBuf;
+use std::process::ExitCode;
+
+use clap::{Parser, Subcommand};
+
+mod commands;
+mod error;
+mod output;
+
+use error::Result;
+
+#[derive(Parser)]
+#[command(name = "brit", version, about = "Brit — covenant on git, EPR-native CLI")]
+struct Cli {
+    #[command(subcommand)]
+    command: Command,
+}
+
+#[derive(Subcommand)]
+enum Command {
+    /// Graph operations on the build constellation
+    #[command(subcommand)]
+    Graph(GraphCmd),
+    /// Show which steps are affected by changes
+    Affected(AffectedArgs),
+    /// Compute a topologically-grouped build plan
+    Plan(PlanArgs),
+    /// Compute the content fingerprint of a step's inputs
+    Fingerprint(FingerprintArgs),
+    /// Manage rakia baseline refs
+    #[command(subcommand)]
+    Baseline(BaselineCmd),
+}
+
+#[derive(Subcommand)]
+enum GraphCmd {
+    /// Discover and list all build manifests
+    Discover {
+        #[arg(long, default_value = ".")]
+        repo: PathBuf,
+    },
+    /// Show the full constellation graph
+    Show {
+        #[arg(long, default_value = ".")]
+        repo: PathBuf,
+        #[arg(long, default_value = "json", value_parser = ["json", "dot"])]
+        format: String,
+    },
+}
+
+#[derive(clap::Args)]
+struct AffectedArgs {
+    #[arg(long, default_value = ".")]
+    repo: PathBuf,
+    /// Comma-separated list of changed files (workspace-relative)
+    #[arg(long, conflicts_with = "since", required_unless_present = "since")]
+    files: Option<String>,
+    /// Compute affected from changes since the given git ref (e.g.
baseline)
+    #[arg(long)]
+    since: Option<String>,
+}
+
+#[derive(clap::Args)]
+struct PlanArgs {
+    #[arg(long, default_value = ".")]
+    repo: PathBuf,
+    #[arg(long, conflicts_with = "since", required_unless_present = "since")]
+    files: Option<String>,
+    #[arg(long)]
+    since: Option<String>,
+    /// Pipeline name (used to locate baseline ref when --since is auto)
+    #[arg(long)]
+    pipeline: Option<String>,
+}
+
+#[derive(clap::Args)]
+struct FingerprintArgs {
+    /// Path to a build-manifest.json
+    manifest: PathBuf,
+    /// Specific step name (default: all steps in the manifest)
+    #[arg(long)]
+    step: Option<String>,
+    /// Git ref or SHA to fingerprint against (default: HEAD)
+    #[arg(long, default_value = "HEAD")]
+    commit: String,
+}
+
+#[derive(Subcommand)]
+enum BaselineCmd {
+    /// Read the current baseline ref for a pipeline
+    Read {
+        pipeline: String,
+        #[arg(long, default_value = ".")]
+        repo: PathBuf,
+    },
+    /// Write a baseline ref for a pipeline
+    Write {
+        pipeline: String,
+        commit: String,
+        #[arg(long, default_value = ".")]
+        repo: PathBuf,
+    },
+    /// One-shot migration from Jenkins pipeline-baselines.json
+    Migrate {
+        json_path: PathBuf,
+        #[arg(long, default_value = ".")]
+        repo: PathBuf,
+    },
+}
+
+fn run() -> Result<()> {
+    let cli = Cli::parse();
+    match cli.command {
+        Command::Graph(GraphCmd::Discover { repo }) => commands::graph_discover::run(&repo),
+        Command::Graph(GraphCmd::Show { repo, format }) => commands::graph_show::run(&repo, &format),
+        Command::Affected(args) => commands::affected::run(&args.repo, args.files.as_deref(), args.since.as_deref()),
+        Command::Plan(args) => commands::plan::run(&args.repo, args.files.as_deref(), args.since.as_deref(), args.pipeline.as_deref()),
+        Command::Fingerprint(args) => commands::fingerprint::run(&args.manifest, args.step.as_deref(), &args.commit),
+        Command::Baseline(BaselineCmd::Read { pipeline, repo }) => commands::baseline::read(&repo, &pipeline),
+        Command::Baseline(BaselineCmd::Write { pipeline, commit, repo }) =>
commands::baseline::write(&repo, &pipeline, &commit),
+        Command::Baseline(BaselineCmd::Migrate { json_path, repo }) => commands::baseline::migrate(&repo, &json_path),
+    }
+}
+
+fn main() -> ExitCode {
+    match run() {
+        Ok(()) => ExitCode::SUCCESS,
+        Err(e) => {
+            eprintln!("error: {e}");
+            ExitCode::from(e.exit_code() as u8)
+        }
+    }
+}
diff --git a/brit-cli/src/output.rs b/brit-cli/src/output.rs
new file mode 100644
index 00000000000..66715b561e7
--- /dev/null
+++ b/brit-cli/src/output.rs
@@ -0,0 +1,9 @@
+//! Output helpers — pretty JSON to stdout, errors to stderr.
+
+use serde::Serialize;
+
+pub fn print_json<T: Serialize>(value: &T) -> Result<(), serde_json::Error> {
+    let s = serde_json::to_string_pretty(value)?;
+    println!("{s}");
+    Ok(())
+}
diff --git a/brit-cli/tests/cli_smoke.rs b/brit-cli/tests/cli_smoke.rs
new file mode 100644
index 00000000000..0a657788679
--- /dev/null
+++ b/brit-cli/tests/cli_smoke.rs
@@ -0,0 +1,76 @@
+use std::process::Command;
+
+fn rakia_binary() -> std::path::PathBuf {
+    let manifest_dir = std::path::PathBuf::from(env!("CARGO_MANIFEST_DIR"));
+    // workspace target/debug/rakia (brit-cli is a workspace member, binary renamed from `brit`)
+    manifest_dir.join("../target/debug/rakia")
+}
+
+#[test]
+fn graph_discover_outputs_json_with_manifests() {
+    // Use the actual repo root (three levels up from brit-cli)
+    let repo_root = std::path::PathBuf::from(env!("CARGO_MANIFEST_DIR"))
+        .join("../../../").canonicalize().unwrap();
+
+    let out = Command::new(rakia_binary())
+        .args(["graph", "discover", "--repo"])
+        .arg(&repo_root)
+        .output()
+        .expect("invoke rakia");
+
+    assert!(out.status.success(),
+        "exit {} stderr: {}",
+        out.status,
+        String::from_utf8_lossy(&out.stderr));
+    let stdout = String::from_utf8(out.stdout).expect("utf8 stdout");
+    let v: serde_json::Value = serde_json::from_str(&stdout).expect("parse json");
+    assert!(v.get("manifests").is_some(), "expected 'manifests' key in output");
+
+    let manifests =
v["manifests"].as_array().expect("manifests is array");
+    assert!(manifests.len() >= 8,
+        "expected at least 8 manifests, got {}", manifests.len());
+}
+
+#[test]
+fn fingerprint_emits_content_addressed_hex_for_real_manifest() {
+    let repo_root = std::path::PathBuf::from(env!("CARGO_MANIFEST_DIR"))
+        .join("../../../").canonicalize().unwrap();
+
+    let manifest = repo_root.join("app/elohim-app/build-manifest.json");
+    if !manifest.exists() {
+        // Skip if running outside the elohim repo
+        return;
+    }
+
+    let out = std::process::Command::new(rakia_binary())
+        .args(["fingerprint"])
+        .arg(&manifest)
+        .args(["--step", "build-angular"])
+        .output()
+        .expect("invoke rakia");
+
+    assert!(
+        out.status.success(),
+        "exit {} stderr: {}",
+        out.status,
+        String::from_utf8_lossy(&out.stderr)
+    );
+
+    let stdout = String::from_utf8(out.stdout).expect("utf8");
+    let v: serde_json::Value = serde_json::from_str(&stdout).expect("parse json");
+    let fps = v["fingerprints"].as_array().expect("fingerprints array");
+    assert_eq!(fps.len(), 1, "filtered to one step");
+
+    let fp = &fps[0];
+    assert_eq!(fp["step"], "build-angular");
+    let hex = fp["fingerprint"].as_str().expect("fingerprint string");
+    assert_eq!(hex.len(), 64, "blake3 hex is 64 chars");
+    assert!(hex.chars().all(|c| c.is_ascii_hexdigit()), "hex");
+    let input_count = fp["input_count"].as_u64().expect("input_count");
+    assert!(input_count > 0, "build-angular should match real source files");
+
+    // Verify the new `commit` field is also a 40-char hex SHA
+    let commit = v["commit"].as_str().expect("commit string");
+    assert_eq!(commit.len(), 40, "git SHA-1 is 40 hex chars");
+    assert!(commit.chars().all(|c| c.is_ascii_hexdigit()), "hex");
+}
diff --git a/brit-epr/Cargo.toml b/brit-epr/Cargo.toml
new file mode 100644
index 00000000000..e4b5453e7c1
--- /dev/null
+++ b/brit-epr/Cargo.toml
@@ -0,0 +1,37 @@
+lints.workspace = true
+
+[package]
+name = "brit-epr"
+version = "0.0.0"
+description = "Elohim Protocol primitives
(pillar trailers, dispatch trait, validation) for brit — an expansion of gitoxide with covenant semantics"
+repository = "https://github.com/ethosengine/brit"
+authors = ["Matthew Dowell "]
+license = "MIT OR Apache-2.0"
+edition = "2021"
+rust-version = "1.82"
+
+[lib]
+doctest = false
+
+[features]
+default = ["elohim-protocol"]
+# Gates the elohim module — brit's first-party app schema implementation.
+# With this feature off, brit-epr is the covenant engine alone: trailer
+# parsing, the AppSchema dispatch trait, error types. No pillar-specific
+# behavior. A downstream fork can disable this feature and ship their own
+# app schema crate.
+elohim-protocol = []
+
+[dependencies]
+gix-object = { version = "^0.58.0", path = "../gix-object", features = ["sha1"] }
+thiserror = "2.0"
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+blake3 = "1"
+ed25519-dalek = { version = "2", features = ["rand_core"] }
+rand = "0.8"
+chrono = { version = "0.4", features = ["serde"], default-features = false }
+hex = "0.4"
+
+[dev-dependencies]
+tempfile = "3"
diff --git a/brit-epr/src/elohim/attestation/build.rs b/brit-epr/src/elohim/attestation/build.rs
new file mode 100644
index 00000000000..0f068333dac
--- /dev/null
+++ b/brit-epr/src/elohim/attestation/build.rs
@@ -0,0 +1,46 @@
+use serde::{Deserialize, Serialize};
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+use crate::engine::signing::Signed;
+
+/// Records that an agent produced an output artifact from a manifest's inputs.
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct BuildAttestationContentNode {
+    /// CID of the build manifest that was executed.
+    pub manifest_cid: BritCid,
+    /// Pipeline step name (e.g. `elohim-edge:cargo-build-storage`).
+    pub step_name: String,
+    /// Hash of all input files consumed by this build step.
+    pub inputs_hash: String,
+    /// CID of the artifact produced by this step.
+    pub output_cid: BritCid,
+    /// Agent identifier (public key or agent DID).
+    pub agent_id: String,
+    /// Hardware profile of the build machine (arch, OS, memory, etc.).
+    pub hardware_profile: serde_json::Value,
+    /// Wall-clock duration of the build in milliseconds.
+    pub build_duration_ms: u64,
+    /// ISO-8601 timestamp when the build completed.
+    pub built_at: String,
+    /// Whether the build step succeeded.
+    pub success: bool,
+    /// Agent signature over the attestation payload.
+    pub signature: String,
+}
+
+impl ContentNode for BuildAttestationContentNode {
+    fn content_type(&self) -> &'static str {
+        "brit.build-attestation"
+    }
+}
+
+impl Signed for BuildAttestationContentNode {
+    fn signature(&self) -> &str { &self.signature }
+    fn agent_id(&self) -> &str { &self.agent_id }
+    fn without_signature(&self) -> Self {
+        let mut c = self.clone();
+        c.signature = String::new();
+        c
+    }
+}
diff --git a/brit-epr/src/elohim/attestation/deploy.rs b/brit-epr/src/elohim/attestation/deploy.rs
new file mode 100644
index 00000000000..05297a6b6ef
--- /dev/null
+++ b/brit-epr/src/elohim/attestation/deploy.rs
@@ -0,0 +1,61 @@
+use serde::{Deserialize, Serialize};
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+use crate::engine::signing::Signed;
+
+/// Health status of a deployed service at attestation time.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum HealthStatus {
+    /// Service is reachable and responding correctly.
+    Healthy,
+    /// Service is reachable but returning degraded responses.
+    Degraded,
+    /// Service did not respond within the health-check timeout.
+    Unreachable,
+}
+
+/// Records that an agent confirms an artifact is live at an environment.
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct DeployAttestationContentNode {
+    /// CID of the artifact that was deployed.
+    pub artifact_cid: BritCid,
+    /// Pipeline step name (e.g. `elohim-edge:cargo-build-storage`).
+    pub step_name: String,
+    /// Human-readable label for the deployment environment (e.g. `staging`).
+    pub environment_label: String,
+    /// Base URL of the deployed service.
+    pub endpoint: String,
+    /// EPR reference for the liveness health check (e.g. `epr:{service-cid}/health`).
+    /// Resolves through doorway when protocol-aware; degrades to no-op on stock git forges.
+    pub health_check_epr: String,
+    /// Health status observed at attestation time.
+    pub health_status: HealthStatus,
+    /// ISO-8601 timestamp when the artifact was deployed.
+    pub deployed_at: String,
+    /// ISO-8601 timestamp when the health check was performed.
+    pub attested_at: String,
+    /// Seconds before this attestation expires and a re-check is required.
+    pub liveness_ttl_sec: u64,
+    /// Agent identifier (public key or agent DID).
+    pub agent_id: String,
+    /// Agent signature over the attestation payload.
+    pub signature: String,
+}
+
+impl ContentNode for DeployAttestationContentNode {
+    fn content_type(&self) -> &'static str {
+        "brit.deploy-attestation"
+    }
+}
+
+impl Signed for DeployAttestationContentNode {
+    fn signature(&self) -> &str { &self.signature }
+    fn agent_id(&self) -> &str { &self.agent_id }
+    fn without_signature(&self) -> Self {
+        let mut c = self.clone();
+        c.signature = String::new();
+        c
+    }
+}
diff --git a/brit-epr/src/elohim/attestation/mod.rs b/brit-epr/src/elohim/attestation/mod.rs
new file mode 100644
index 00000000000..f5d95653ca5
--- /dev/null
+++ b/brit-epr/src/elohim/attestation/mod.rs
@@ -0,0 +1,10 @@
+//! Attestation ContentNode types for the Elohim Protocol.
+
+/// Build attestation — records an agent producing an output artifact.
+pub mod build;
+/// Deploy attestation — records an agent confirming an artifact is live.
+pub mod deploy;
+/// Reach computation — derives a reach level from existing attestations.
+pub mod reach;
+/// Validation attestation — records a named check applied to an artifact.
+pub mod validation;
diff --git a/brit-epr/src/elohim/attestation/reach.rs b/brit-epr/src/elohim/attestation/reach.rs
new file mode 100644
index 00000000000..514f4fa627e
--- /dev/null
+++ b/brit-epr/src/elohim/attestation/reach.rs
@@ -0,0 +1,42 @@
+//! Reach computation — derives a reach level from existing attestations.
+
+use serde::{Deserialize, Serialize};
+
+/// Reach level derived from attestations. Unknown < Built < Deployed < Verified.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum ReachLevel {
+    /// No attestations exist for this artifact.
+    Unknown,
+    /// Build attestation exists; artifact was produced successfully.
+    Built,
+    /// Build and deploy attestations exist; artifact is live.
+    Deployed,
+    /// Build, deploy, and validation attestations exist; artifact is verified.
+    Verified,
+}
+
+/// Input for reach computation.
+#[derive(Debug, Clone)]
+pub struct ReachInput {
+    /// Agent IDs or step names of agents that produced build attestations.
+    pub build_attestations: Vec<String>,
+    /// Environment labels or step names of deploy attestations.
+    pub deploy_attestations: Vec<String>,
+    /// Check names of validation attestations.
+    pub validation_attestations: Vec<String>,
+}
+
+/// Compute reach level. Deterministic: same inputs = same output.
+pub fn compute_reach(input: &ReachInput) -> ReachLevel {
+    let has_build = !input.build_attestations.is_empty();
+    let has_deploy = !input.deploy_attestations.is_empty();
+    let has_validation = !input.validation_attestations.is_empty();
+
+    match (has_build, has_deploy, has_validation) {
+        (true, true, true) => ReachLevel::Verified,
+        (true, true, false) => ReachLevel::Deployed,
+        (true, false, _) => ReachLevel::Built,
+        (false, _, _) => ReachLevel::Unknown,
+    }
+}
diff --git a/brit-epr/src/elohim/attestation/validation.rs b/brit-epr/src/elohim/attestation/validation.rs
new file mode 100644
index 00000000000..ccce2fb2b24
--- /dev/null
+++ b/brit-epr/src/elohim/attestation/validation.rs
@@ -0,0 +1,60 @@
+use serde::{Deserialize, Serialize};
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+use crate::engine::signing::Signed;
+
+/// Outcome of a named validation check.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum ValidationResult {
+    /// All criteria satisfied — no issues found.
+    Pass,
+    /// One or more blocking issues found.
+    Fail,
+    /// Non-blocking issues found; manual review recommended.
+    Warn,
+    /// Check was skipped (e.g. not applicable to this artifact).
+    Skip,
+}
+
+/// Records that a validator applied a named check to an artifact.
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ValidationAttestationContentNode {
+    /// CID of the artifact under validation.
+    pub artifact_cid: BritCid,
+    /// Name and version of the check (e.g. `sonarqube-scan@v10`).
+    pub check_name: String,
+    /// Identifier of the agent that ran the check.
+    pub validator_id: String,
+    /// Version of the validator tool.
+    pub validator_version: String,
+    /// Outcome of the check.
+    pub result: ValidationResult,
+    /// Human-readable summary of the findings.
+    pub result_summary: String,
+    /// Optional CID pointing to detailed findings output.
+    pub findings_cid: Option<BritCid>,
+    /// ISO-8601 timestamp when the check was performed.
+    pub validated_at: String,
+    /// Seconds before this attestation expires; `None` means no expiry.
+    pub ttl_sec: Option<u64>,
+    /// Agent signature over the attestation payload.
+    pub signature: String,
+}
+
+impl ContentNode for ValidationAttestationContentNode {
+    fn content_type(&self) -> &'static str {
+        "brit.validation-attestation"
+    }
+}
+
+impl Signed for ValidationAttestationContentNode {
+    fn signature(&self) -> &str { &self.signature }
+    fn agent_id(&self) -> &str { &self.validator_id }
+    fn without_signature(&self) -> Self {
+        let mut c = self.clone();
+        c.signature = String::new();
+        c
+    }
+}
diff --git a/brit-epr/src/elohim/mod.rs b/brit-epr/src/elohim/mod.rs
new file mode 100644
index 00000000000..6cfa3b5f271
--- /dev/null
+++ b/brit-epr/src/elohim/mod.rs
@@ -0,0 +1,16 @@
+//! Elohim Protocol app schema — first-party `AppSchema` implementation.
+//!
+//! Gated behind `#[cfg(feature = "elohim-protocol")]`. With this feature
+//! disabled, `brit-epr` ships only the engine.
+
+mod parse;
+mod pillar_trailers;
+mod schema;
+mod validate;
+pub mod attestation;
+pub mod refs;
+
+pub use parse::parse_pillar_trailers;
+pub use pillar_trailers::{PillarTrailers, TrailerKey};
+pub use schema::ElohimProtocolSchema;
+pub use validate::{validate_pillar_trailers, PillarValidationError};
diff --git a/brit-epr/src/elohim/parse.rs b/brit-epr/src/elohim/parse.rs
new file mode 100644
index 00000000000..5bc6767a0a1
--- /dev/null
+++ b/brit-epr/src/elohim/parse.rs
@@ -0,0 +1,39 @@
+//! `parse_pillar_trailers` — convenience function that projects a
+//! `TrailerSet` into the strongly-typed `PillarTrailers` view.
+
+use crate::elohim::pillar_trailers::{PillarTrailers, TrailerKey};
+use crate::engine::parse_trailer_block;
+
+/// Parse pillar trailers from a commit body.
+/// +/// Pure function: no I/O beyond reading the body slice. Unknown trailers +/// (anything outside the six reserved pillar keys) are silently skipped — +/// a commit may carry `Signed-off-by:`, `Co-Authored-By:`, etc., alongside +/// the pillar trailers. +/// +/// Permissive: malformed values in `*_Node:` trailers are accepted as raw +/// strings. Strict validation is done by `validate_pillar_trailers`. +pub fn parse_pillar_trailers(body: &[u8]) -> PillarTrailers { + let set = parse_trailer_block(body); + let mut out = PillarTrailers::default(); + + for (key, value) in set.iter() { + for pillar in TrailerKey::all() { + if key == pillar.summary_token() { + match pillar { + TrailerKey::Lamad => out.lamad = Some(value.to_string()), + TrailerKey::Shefa => out.shefa = Some(value.to_string()), + TrailerKey::Qahal => out.qahal = Some(value.to_string()), + } + } else if key == pillar.node_token() { + match pillar { + TrailerKey::Lamad => out.lamad_node = Some(value.to_string()), + TrailerKey::Shefa => out.shefa_node = Some(value.to_string()), + TrailerKey::Qahal => out.qahal_node = Some(value.to_string()), + } + } + } + } + + out +} diff --git a/brit-epr/src/elohim/pillar_trailers.rs b/brit-epr/src/elohim/pillar_trailers.rs new file mode 100644 index 00000000000..e81ed03e4a1 --- /dev/null +++ b/brit-epr/src/elohim/pillar_trailers.rs @@ -0,0 +1,72 @@ +//! Pillar trailer types — the strongly-typed view the elohim app schema +//! uses to represent the three pillars plus their linked-node CID slots. + +/// Which of the three pillars a trailer belongs to. +/// +/// The elohim protocol pillars: +/// +/// - **Lamad** (לָמַד, "to learn") — knowledge positioning. +/// - **Shefa** (שֶׁפַע, "abundance") — economic positioning. +/// - **Qahal** (קָהָל, "assembly") — governance positioning. +/// +/// Each pillar has two trailer forms: a canonical summary (e.g., `Lamad:`) +/// and a linked-node CID reference (e.g., `Lamad-Node:`). 
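The permissive, skip-unknown-keys projection described above can be sketched standalone. This is not the crate's gix-backed parser; `project_pillars` and its paragraph-splitting heuristic are hypothetical simplifications used only to illustrate the behavior.

```rust
// Simplified standalone sketch, NOT the crate's gix-backed parser. It models
// the same projection: scan `Key: value` trailer lines in the message's
// final paragraph, keep only the six pillar keys, skip everything else.

const SUMMARY_KEYS: [&str; 3] = ["Lamad", "Shefa", "Qahal"];
const NODE_KEYS: [&str; 3] = ["Lamad-Node", "Shefa-Node", "Qahal-Node"];

/// Returns (summaries, nodes) indexed by pillar (Lamad, Shefa, Qahal).
fn project_pillars(body: &str) -> ([Option<String>; 3], [Option<String>; 3]) {
    let mut summaries: [Option<String>; 3] = [None, None, None];
    let mut nodes: [Option<String>; 3] = [None, None, None];
    // Treat the final paragraph as the trailer block (a simplification;
    // the real engine wraps gitoxide's trailer parsing).
    let block = body.rsplit("\n\n").next().unwrap_or(body);
    for line in block.lines() {
        if let Some((key, value)) = line.split_once(':') {
            let (key, value) = (key.trim(), value.trim());
            if let Some(i) = SUMMARY_KEYS.iter().position(|k| *k == key) {
                summaries[i] = Some(value.to_string());
            } else if let Some(i) = NODE_KEYS.iter().position(|k| *k == key) {
                nodes[i] = Some(value.to_string());
            } // anything else (Signed-off-by:, Co-Authored-By:, ...) is skipped
        }
    }
    (summaries, nodes)
}

fn main() {
    let body = "Add covenant engine\n\nLamad: learned X\nShefa: neutral\nSigned-off-by: A <a@example.com>\nQahal-Node: abc123";
    let (summaries, nodes) = project_pillars(body);
    assert_eq!(summaries[0].as_deref(), Some("learned X"));
    assert_eq!(summaries[2], None); // no Qahal summary in this message
    assert_eq!(nodes[2].as_deref(), Some("abc123"));
}
```

Note how `Signed-off-by:` is parsed but dropped: permissiveness toward foreign trailers is what lets pillar trailers coexist with ordinary git conventions.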
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
+pub enum TrailerKey {
+    /// Knowledge-layer trailer.
+    Lamad,
+    /// Economic-layer trailer.
+    Shefa,
+    /// Governance-layer trailer.
+    Qahal,
+}
+
+impl TrailerKey {
+    /// The RFC-822 token name for the canonical-summary trailer.
+    pub fn summary_token(self) -> &'static str {
+        match self {
+            TrailerKey::Lamad => "Lamad",
+            TrailerKey::Shefa => "Shefa",
+            TrailerKey::Qahal => "Qahal",
+        }
+    }
+
+    /// The RFC-822 token name for the linked-node CID trailer.
+    pub fn node_token(self) -> &'static str {
+        match self {
+            TrailerKey::Lamad => "Lamad-Node",
+            TrailerKey::Shefa => "Shefa-Node",
+            TrailerKey::Qahal => "Qahal-Node",
+        }
+    }
+
+    /// All three pillars, in canonical order.
+    pub fn all() -> [TrailerKey; 3] {
+        [TrailerKey::Lamad, TrailerKey::Shefa, TrailerKey::Qahal]
+    }
+}
+
+/// Pillar trailers extracted from a commit body and projected into the
+/// typed view the elohim app schema uses.
+///
+/// Each `*_node` field holds the raw CID *string* — Phase 1 does not parse
+/// the string into a typed `Cid`, does not resolve it, and does not check
+/// the target's type. The parser is permissive; strict CID validation and
+/// resolution arrive in Phase 2.
+#[derive(Debug, Clone, Default, PartialEq, Eq)]
+pub struct PillarTrailers {
+    /// Canonical summary value of the `Lamad:` trailer, trimmed.
+    pub lamad: Option<String>,
+    /// Canonical summary value of the `Shefa:` trailer, trimmed.
+    pub shefa: Option<String>,
+    /// Canonical summary value of the `Qahal:` trailer, trimmed.
+    pub qahal: Option<String>,
+
+    /// Raw CID string from a `Lamad-Node:` trailer, if present. Phase 1
+    /// does not parse or resolve this.
+    pub lamad_node: Option<String>,
+    /// Raw CID string from a `Shefa-Node:` trailer, if present.
+    pub shefa_node: Option<String>,
+    /// Raw CID string from a `Qahal-Node:` trailer, if present.
+    pub qahal_node: Option<String>,
+}
diff --git a/brit-epr/src/elohim/refs.rs b/brit-epr/src/elohim/refs.rs
new file mode 100644
index 00000000000..23ad3b0f04c
--- /dev/null
+++ b/brit-epr/src/elohim/refs.rs
@@ -0,0 +1,220 @@
+//! `BritRefManager` — read/write/list git refs under `refs/notes/brit/`.
+
+use std::path::{Path, PathBuf};
+use std::process::Command;
+
+/// Manages git refs under `refs/notes/brit/` for attestation indexing.
+pub struct BritRefManager {
+    repo_path: PathBuf,
+}
+
+impl BritRefManager {
+    /// Create a new `BritRefManager` for the given repository root.
+    ///
+    /// Returns [`RefError::NotARepo`] if `repo_path` is not a git repository.
+    pub fn new(repo_path: &Path) -> Result<Self, RefError> {
+        if !repo_path.join(".git").exists() && !repo_path.join("HEAD").exists() {
+            return Err(RefError::NotARepo(repo_path.display().to_string()));
+        }
+        Ok(Self { repo_path: repo_path.to_path_buf() })
+    }
+
+    // --- Build refs (per-commit via git notes) ---
+
+    /// Write a build attestation payload as a git note on `commit_rev`.
+    pub fn put_build_ref(&self, step_name: &str, commit_rev: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/build/{}", Self::encode_segment(step_name));
+        let commit_sha = self.resolve_rev(commit_rev)?;
+        self.write_note(&ref_name, &commit_sha, payload)
+    }
+
+    /// Read the build attestation payload for `commit_rev`, or `None` if absent.
+    pub fn get_build_ref(&self, step_name: &str, commit_rev: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/build/{}", Self::encode_segment(step_name));
+        let commit_sha = self.resolve_rev(commit_rev)?;
+        self.read_note(&ref_name, &commit_sha)
+    }
+
+    /// List all build ref step names under `refs/notes/brit/build/`.
+    pub fn list_build_refs(&self, _pattern: Option<&str>) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/build/")
+    }
+
+    // --- Deploy refs (standalone blob refs) ---
+
+    /// Write a deploy attestation payload as a standalone blob ref.
+    pub fn put_deploy_ref(&self, step_name: &str, env: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/deploy/{}/{}", Self::encode_segment(step_name), Self::encode_segment(env));
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the deploy attestation payload, or `None` if absent.
+    pub fn get_deploy_ref(&self, step_name: &str, env: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/deploy/{}/{}", Self::encode_segment(step_name), Self::encode_segment(env));
+        self.read_ref_blob(&ref_name)
+    }
+
+    /// List all deploy ref names under `refs/notes/brit/deploy/`.
+    pub fn list_deploy_refs(&self, _pattern: Option<&str>) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/deploy/")
+    }
+
+    // --- Validate refs ---
+
+    /// Write a validation attestation payload as a standalone blob ref.
+    pub fn put_validate_ref(&self, step_name: &str, check_name: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/validate/{}/{}", Self::encode_segment(step_name), Self::encode_segment(check_name));
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the validation attestation payload, or `None` if absent.
+    pub fn get_validate_ref(&self, step_name: &str, check_name: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/validate/{}/{}", Self::encode_segment(step_name), Self::encode_segment(check_name));
+        self.read_ref_blob(&ref_name)
+    }
+
+    /// List all validate ref names under `refs/notes/brit/validate/`.
+    pub fn list_validate_refs(&self, _pattern: Option<&str>) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/validate/")
+    }
+
+    // --- Reach refs ---
+
+    /// Write a reach payload as a standalone blob ref.
+    pub fn put_reach_ref(&self, step_name: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/reach/{}", Self::encode_segment(step_name));
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the reach payload, or `None` if absent.
+    pub fn get_reach_ref(&self, step_name: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/reach/{}", Self::encode_segment(step_name));
+        self.read_ref_blob(&ref_name)
+    }
+
+    // --- Internal helpers ---
+
+    /// Encode a ref path segment, replacing characters git rejects in ref names.
+    ///
+    /// Per `git check-ref-format`, ref names cannot contain: `:`, `@{`, `?`,
+    /// `*`, `[`, `\`, `^`, `~`, space, DEL, or control characters. They also
+    /// cannot contain `..` or end with `.lock`. We percent-encode a superset
+    /// of forbidden characters to be safe.
+    fn encode_segment(s: &str) -> String {
+        let mut out = String::with_capacity(s.len());
+        for c in s.chars() {
+            match c {
+                ':' => out.push_str("%3A"),
+                '@' => out.push_str("%40"),
+                '?' => out.push_str("%3F"),
+                '*' => out.push_str("%2A"),
+                '[' => out.push_str("%5B"),
+                '\\' => out.push_str("%5C"),
+                '^' => out.push_str("%5E"),
+                '~' => out.push_str("%7E"),
+                ' ' => out.push_str("%20"),
+                c if c.is_ascii_control() => {
+                    use std::fmt::Write;
+                    let _ = write!(out, "%{:02X}", c as u8);
+                }
+                _ => out.push(c),
+            }
+        }
+        // Also handle ".." which git forbids in ref names
+        out.replace("..", "%2E%2E")
+    }
+
+    /// Decode a percent-encoded ref path segment back to its original form.
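To make the substitution table concrete, here is a standalone copy of the encoding step as a free function (the crate keeps it as a private method; this sketch exists only so the behavior can be exercised in isolation).

```rust
// Standalone sketch of the ref-segment encoding above: percent-encode the
// characters git rejects in ref names, then neutralize "..".

fn encode_segment(s: &str) -> String {
    let mut out = String::with_capacity(s.len());
    for c in s.chars() {
        match c {
            ':' => out.push_str("%3A"),
            '@' => out.push_str("%40"),
            '?' => out.push_str("%3F"),
            '*' => out.push_str("%2A"),
            '[' => out.push_str("%5B"),
            '\\' => out.push_str("%5C"),
            '^' => out.push_str("%5E"),
            '~' => out.push_str("%7E"),
            ' ' => out.push_str("%20"),
            c if c.is_ascii_control() => {
                use std::fmt::Write;
                let _ = write!(out, "%{:02X}", c as u8);
            }
            _ => out.push(c),
        }
    }
    out.replace("..", "%2E%2E")
}

fn main() {
    // "deploy: prod" contains a colon and a space, both of which git
    // rejects in ref names.
    assert_eq!(encode_segment("deploy: prod"), "deploy%3A%20prod");
    // ".." is forbidden anywhere in a ref name.
    assert_eq!(encode_segment("v1..v2"), "v1%2E%2Ev2");
    // Ordinary names pass through untouched.
    assert_eq!(encode_segment("build-step"), "build-step");
}
```

One caveat visible in the sketch: a literal `%` is not itself escaped, so inputs that already contain percent sequences are ambiguous under decoding; the scheme assumes step and environment names do not use `%`.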
+    fn decode_segment(s: &str) -> String {
+        s.replace("%3A", ":")
+            .replace("%40", "@")
+            .replace("%3F", "?")
+            .replace("%2A", "*")
+            .replace("%5B", "[")
+            .replace("%5C", "\\")
+            .replace("%5E", "^")
+            .replace("%7E", "~")
+            .replace("%20", " ")
+            .replace("%2E%2E", "..")
+    }
+
+    fn resolve_rev(&self, rev: &str) -> Result<String, RefError> {
+        let output = Command::new("git").args(["rev-parse", rev]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !output.status.success() { return Err(RefError::RevNotFound(rev.to_string())); }
+        Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
+    }
+
+    fn write_note(&self, ref_name: &str, commit_sha: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let json = serde_json::to_string(payload).map_err(RefError::Json)?;
+        let output = Command::new("git").args(["notes", "--ref", ref_name, "add", "-f", "-m", &json, commit_sha]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !output.status.success() {
+            return Err(RefError::GitFailed(format!("notes add to {ref_name}: {}", String::from_utf8_lossy(&output.stderr))));
+        }
+        Ok(())
+    }
+
+    fn read_note(&self, ref_name: &str, commit_sha: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let output = Command::new("git").args(["notes", "--ref", ref_name, "show", commit_sha]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !output.status.success() { return Ok(None); }
+        let text = String::from_utf8_lossy(&output.stdout);
+        let value = serde_json::from_str(text.trim()).map_err(RefError::Json)?;
+        Ok(Some(value))
+    }
+
+    fn write_ref_blob(&self, ref_name: &str, payload: &serde_json::Value) -> Result<(), RefError> {
+        let json = serde_json::to_string(payload).map_err(RefError::Json)?;
+        let mut child = Command::new("git").args(["hash-object", "-w", "--stdin"]).current_dir(&self.repo_path)
+            .stdin(std::process::Stdio::piped()).stdout(std::process::Stdio::piped()).spawn().map_err(RefError::GitCommand)?;
+        {
+            use std::io::Write;
+            let mut stdin = child.stdin.take()
+                .ok_or_else(|| RefError::GitFailed("stdin pipe unavailable".into()))?;
+            stdin.write_all(json.as_bytes()).map_err(RefError::GitCommand)?;
+        }
+        let hash_output = child.wait_with_output().map_err(RefError::GitCommand)?;
+        if !hash_output.status.success() { return Err(RefError::GitFailed("hash-object failed".into())); }
+        let blob_sha = String::from_utf8_lossy(&hash_output.stdout).trim().to_string();
+
+        let update_output = Command::new("git").args(["update-ref", ref_name, &blob_sha]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !update_output.status.success() {
+            return Err(RefError::GitFailed(format!("update-ref {ref_name}: {}", String::from_utf8_lossy(&update_output.stderr))));
+        }
+        Ok(())
+    }
+
+    fn read_ref_blob(&self, ref_name: &str) -> Result<Option<serde_json::Value>, RefError> {
+        let output = Command::new("git").args(["cat-file", "-p", ref_name]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !output.status.success() { return Ok(None); }
+        let text = String::from_utf8_lossy(&output.stdout);
+        let value = serde_json::from_str(text.trim()).map_err(RefError::Json)?;
+        Ok(Some(value))
+    }
+
+    fn list_refs_under(&self, prefix: &str) -> Result<Vec<String>, RefError> {
+        let output = Command::new("git").args(["for-each-ref", "--format=%(refname)", prefix]).current_dir(&self.repo_path).output().map_err(RefError::GitCommand)?;
+        if !output.status.success() { return Ok(Vec::new()); }
+        let text = String::from_utf8_lossy(&output.stdout);
+        let names: Vec<String> = text.lines().filter(|l| !l.is_empty()).filter_map(|line| line.strip_prefix(prefix)).map(Self::decode_segment).collect();
+        Ok(names)
+    }
+}
+
+/// Errors returned by [`BritRefManager`] operations.
+#[derive(Debug, thiserror::Error)]
+pub enum RefError {
+    /// The given path is not a git repository.
+    #[error("not a git repository: {0}")]
+    NotARepo(String),
+    /// A git revision could not be resolved.
+ #[error("rev not found: {0}")] + RevNotFound(String), + /// Failed to spawn or communicate with a git subprocess. + #[error("git command error: {0}")] + GitCommand(std::io::Error), + /// A git subprocess exited with a non-zero status. + #[error("git command failed: {0}")] + GitFailed(String), + /// JSON serialization or deserialization failed. + #[error("JSON error: {0}")] + Json(serde_json::Error), +} diff --git a/brit-epr/src/elohim/schema.rs b/brit-epr/src/elohim/schema.rs new file mode 100644 index 00000000000..9b8a8240155 --- /dev/null +++ b/brit-epr/src/elohim/schema.rs @@ -0,0 +1,61 @@ +//! `ElohimProtocolSchema` — the first-party `AppSchema` implementation. + +use crate::elohim::pillar_trailers::TrailerKey; +use crate::engine::{AppSchema, TrailerSet, ValidationError}; + +/// Zero-sized implementor of [`AppSchema`] for the Elohim Protocol. +/// +/// Instances are stateless. Typically you construct one like +/// `const SCHEMA: ElohimProtocolSchema = ElohimProtocolSchema;` and pass +/// by reference. +#[derive(Debug, Clone, Copy, Default)] +pub struct ElohimProtocolSchema; + +const SUMMARY_KEYS: &[&str] = &["Lamad", "Shefa", "Qahal"]; +const NODE_KEYS: &[&str] = &["Lamad-Node", "Shefa-Node", "Qahal-Node"]; + +impl AppSchema for ElohimProtocolSchema { + fn id(&self) -> &'static str { + "elohim-protocol/1.0.0" + } + + fn owns_key(&self, key: &str) -> bool { + SUMMARY_KEYS.contains(&key) || NODE_KEYS.contains(&key) + } + + fn required_keys(&self) -> &'static [&'static str] { + SUMMARY_KEYS + } + + fn cid_bearing_keys(&self) -> &'static [&'static str] { + NODE_KEYS + } + + fn validate_pair(&self, key: &str, value: &str) -> Result<(), ValidationError> { + if !self.owns_key(key) { + return Ok(()); // not our key; ignore + } + if value.trim().is_empty() { + return Err(ValidationError::EmptyValue(key.to_string())); + } + // Phase 1: no additional format checks. Phase 2 adds CID parsing on + // NODE_KEYS. 
+ Ok(()) + } + + fn validate_set(&self, trailers: &TrailerSet) -> Result<(), ValidationError> { + // Check required keys are present in canonical order so the error + // always names Lamad before Shefa before Qahal. + for key in TrailerKey::all() { + let summary = key.summary_token(); + match trailers.get(summary) { + None => return Err(ValidationError::MissingKey(summary.to_string())), + Some(v) if v.trim().is_empty() => { + return Err(ValidationError::EmptyValue(summary.to_string())) + } + Some(_) => {} + } + } + Ok(()) + } +} diff --git a/brit-epr/src/elohim/validate.rs b/brit-epr/src/elohim/validate.rs new file mode 100644 index 00000000000..a12ed46833e --- /dev/null +++ b/brit-epr/src/elohim/validate.rs @@ -0,0 +1,46 @@ +//! Structural validation for pillar trailers. +//! +//! Checks that each pillar has a non-empty summary value. Does NOT resolve +//! linked-node CIDs, does NOT traverse the ContentNode graph, does NOT +//! enforce domain rules — those live in higher layers (Phase 2+). + +use thiserror::Error; + +use super::pillar_trailers::{PillarTrailers, TrailerKey}; + +/// Structural validation errors. +#[derive(Debug, Error, PartialEq, Eq)] +pub enum PillarValidationError { + /// Required pillar summary trailer is missing. + #[error("required pillar trailer missing: {0:?}")] + MissingPillar(TrailerKey), + + /// Pillar summary trailer is present but empty after trimming. + #[error("pillar trailer {0:?} is present but value is empty")] + EmptyPillar(TrailerKey), +} + +/// Structurally validate a `PillarTrailers` view. +/// +/// Returns `Ok(())` if all three summary trailers are present and non-empty. +/// Returns the first error in canonical order (Lamad → Shefa → Qahal). +/// +/// Linked-node CID strings are ignored by this validator — Phase 1 does +/// not enforce their format or resolvability. 
+pub fn validate_pillar_trailers(t: &PillarTrailers) -> Result<(), PillarValidationError> { + for pillar in TrailerKey::all() { + let summary = match pillar { + TrailerKey::Lamad => t.lamad.as_deref(), + TrailerKey::Shefa => t.shefa.as_deref(), + TrailerKey::Qahal => t.qahal.as_deref(), + }; + match summary { + None => return Err(PillarValidationError::MissingPillar(pillar)), + Some(v) if v.trim().is_empty() => { + return Err(PillarValidationError::EmptyPillar(pillar)) + } + Some(_) => {} + } + } + Ok(()) +} diff --git a/brit-epr/src/engine/app_schema.rs b/brit-epr/src/engine/app_schema.rs new file mode 100644 index 00000000000..b875bb4018b --- /dev/null +++ b/brit-epr/src/engine/app_schema.rs @@ -0,0 +1,37 @@ +//! `AppSchema` — the dispatch contract between the covenant engine and +//! specific app schemas (e.g., `elohim-protocol`). +//! +//! The normative specification is in `docs/schemas/elohim-protocol-manifest.md` +//! §2.3. This file is the Rust projection of that contract. + +use crate::engine::{TrailerSet, ValidationError}; + +/// Dispatch contract that app schemas implement. +/// +/// The engine consumes an `impl AppSchema` to do validation and rendering +/// without knowing the specific vocabulary (Lamad / Shefa / Qahal, or any +/// other app's keys). This is what keeps the engine/app-schema boundary +/// legible — see `elohim-protocol-manifest.md` §11.7 for boundary smells +/// that indicate the boundary is drifting. +pub trait AppSchema { + /// Stable identifier for this schema, e.g. `"elohim-protocol/1.0.0"`. + fn id(&self) -> &'static str; + + /// Does this schema recognize this trailer key? + fn owns_key(&self, key: &str) -> bool; + + /// Required keys. Engine uses this to short-circuit validation when the + /// commit message is missing the required surface entirely. + fn required_keys(&self) -> &'static [&'static str]; + + /// Which keys carry CID references? The resolver walks these in later + /// phases. Phase 1 just records the list. 
+    fn cid_bearing_keys(&self) -> &'static [&'static str];
+
+    /// Validate one `(key, value)` pair in isolation (no cross-field rules).
+    fn validate_pair(&self, key: &str, value: &str) -> Result<(), ValidationError>;
+
+    /// Validate the whole trailer set together (cross-field rules, e.g.
+    /// "`Lamad-Node:` present requires `Lamad:` non-empty").
+    fn validate_set(&self, trailers: &TrailerSet) -> Result<(), ValidationError>;
+}
diff --git a/brit-epr/src/engine/cid.rs b/brit-epr/src/engine/cid.rs
new file mode 100644
index 00000000000..509bbf00b58
--- /dev/null
+++ b/brit-epr/src/engine/cid.rs
@@ -0,0 +1,101 @@
+//! `BritCid` — content identifier based on BLAKE3 hashing.
+//!
+//! Phase 2a uses a simplified CID: the BLAKE3 hash of the canonical JSON
+//! serialization of a ContentNode. Full multiformats CIDv1 comes in a later
+//! phase when interop with IPFS/Holochain requires it.
+
+use std::fmt;
+use std::str::FromStr;
+
+use serde::{Deserialize, Serialize};
+
+/// A content identifier — the BLAKE3 hash of a content payload.
+///
+/// Displayed and parsed as a 64-character lowercase hex string.
+#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)]
+#[serde(transparent)]
+pub struct BritCid(String);
+
+impl BritCid {
+    /// Compute a CID from arbitrary bytes.
+    pub fn compute(data: &[u8]) -> Self {
+        let hash = blake3::hash(data);
+        Self(hash.to_hex().to_string())
+    }
+
+    /// Return the hex string representation.
+    pub fn as_str(&self) -> &str {
+        &self.0
+    }
+}
+
+impl fmt::Display for BritCid {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        f.write_str(&self.0)
+    }
+}
+
+impl FromStr for BritCid {
+    type Err = CidParseError;
+
+    fn from_str(s: &str) -> Result<Self, Self::Err> {
+        if s.len() != 64 {
+            return Err(CidParseError::InvalidLength(s.len()));
+        }
+        if !s.chars().all(|c| c.is_ascii_hexdigit()) {
+            return Err(CidParseError::InvalidHex);
+        }
+        Ok(Self(s.to_lowercase()))
+    }
+}
+
+/// Errors when parsing a CID string.
+#[derive(Debug, thiserror::Error, PartialEq, Eq)]
+pub enum CidParseError {
+    /// Expected 64 hex characters.
+    #[error("expected 64 hex characters, got {0}")]
+    InvalidLength(usize),
+    /// Non-hex character found.
+    #[error("CID contains non-hex characters")]
+    InvalidHex,
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn compute_is_deterministic() {
+        let a = BritCid::compute(b"hello world");
+        let b = BritCid::compute(b"hello world");
+        assert_eq!(a, b);
+    }
+
+    #[test]
+    fn different_input_different_cid() {
+        let a = BritCid::compute(b"hello");
+        let b = BritCid::compute(b"world");
+        assert_ne!(a, b);
+    }
+
+    #[test]
+    fn roundtrip_display_parse() {
+        let cid = BritCid::compute(b"test data");
+        let parsed: BritCid = cid.to_string().parse().unwrap();
+        assert_eq!(cid, parsed);
+    }
+
+    #[test]
+    fn rejects_short_string() {
+        let result = "abc123".parse::<BritCid>();
+        assert_eq!(result, Err(CidParseError::InvalidLength(6)));
+    }
+
+    #[test]
+    fn serde_roundtrip() {
+        let cid = BritCid::compute(b"serde test");
+        let json = serde_json::to_string(&cid).unwrap();
+        let back: BritCid = serde_json::from_str(&json).unwrap();
+        assert_eq!(cid, back);
+    }
+}
diff --git a/brit-epr/src/engine/content_node.rs b/brit-epr/src/engine/content_node.rs
new file mode 100644
index 00000000000..089e64f43ca
--- /dev/null
+++ b/brit-epr/src/engine/content_node.rs
@@ -0,0 +1,48 @@
+//! `ContentNode` — trait for CID-addressed content objects stored locally.
+
+use serde::{de::DeserializeOwned, Serialize};
+use crate::engine::cid::BritCid;
+
+/// Recursively sort all object keys in a JSON value for canonical representation.
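Recursive key sorting is the core of canonicalization: the same logical object must always hash to the same CID. A dependency-free sketch of the idea, using a hypothetical toy `Json` enum in place of `serde_json::Value`:

```rust
// Toy sketch of recursive key sorting for canonical serialization. The real
// implementation works on `serde_json::Value`; this standalone version uses
// a minimal JSON-like enum so the mechanism is testable without dependencies.

use std::collections::BTreeMap;

#[derive(Debug, Clone, PartialEq)]
enum Json {
    Str(String),
    Arr(Vec<Json>),
    Obj(Vec<(String, Json)>), // preserves insertion order, like a raw map
}

fn sort_keys(value: Json) -> Json {
    match value {
        Json::Obj(pairs) => {
            // Recurse into each value first, then route the pairs through a
            // BTreeMap so keys come back out in lexicographic order.
            let sorted: BTreeMap<String, Json> = pairs
                .into_iter()
                .map(|(k, v)| (k, sort_keys(v)))
                .collect();
            Json::Obj(sorted.into_iter().collect())
        }
        Json::Arr(items) => Json::Arr(items.into_iter().map(sort_keys).collect()),
        other => other,
    }
}

fn main() {
    let node = Json::Obj(vec![
        ("zeta".into(), Json::Str("1".into())),
        ("alpha".into(), Json::Obj(vec![
            ("b".into(), Json::Str("2".into())),
            ("a".into(), Json::Str("3".into())),
        ])),
    ]);
    let sorted = sort_keys(node);
    // Keys are now ordered at every level: alpha before zeta, a before b.
    assert_eq!(
        sorted,
        Json::Obj(vec![
            ("alpha".into(), Json::Obj(vec![
                ("a".into(), Json::Str("3".into())),
                ("b".into(), Json::Str("2".into())),
            ])),
            ("zeta".into(), Json::Str("1".into())),
        ])
    );
}
```

Arrays are deliberately left in their original order: element order in a JSON array is semantically meaningful, while object key order is not.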
+fn sort_json_keys(value: serde_json::Value) -> serde_json::Value {
+    match value {
+        serde_json::Value::Object(map) => {
+            let sorted: serde_json::Map<String, serde_json::Value> = map
+                .into_iter()
+                .map(|(k, v)| (k, sort_json_keys(v)))
+                .collect::<std::collections::BTreeMap<_, _>>()
+                .into_iter()
+                .collect();
+            serde_json::Value::Object(sorted)
+        }
+        serde_json::Value::Array(arr) => {
+            serde_json::Value::Array(arr.into_iter().map(sort_json_keys).collect())
+        }
+        other => other,
+    }
+}
+
+/// A content-addressed node that can be serialized to canonical JSON and
+/// stored in the local object store.
+pub trait ContentNode: Serialize + DeserializeOwned {
+    /// The content type discriminator, e.g. `"brit.build-attestation"`.
+    fn content_type(&self) -> &'static str;
+
+    /// Serialize to canonical JSON bytes with lexicographically sorted keys.
+    ///
+    /// Keys are sorted recursively at all nesting levels. This guarantees
+    /// that the same logical content always produces the same byte sequence,
+    /// regardless of struct field declaration order or serialization library
+    /// internals.
+    fn canonical_json(&self) -> Result<Vec<u8>, serde_json::Error> {
+        let value = serde_json::to_value(self)?;
+        let sorted = sort_json_keys(value);
+        serde_json::to_vec(&sorted)
+    }
+
+    /// Compute the content identifier from the canonical JSON.
+    fn compute_cid(&self) -> Result<BritCid, serde_json::Error> {
+        let bytes = self.canonical_json()?;
+        Ok(BritCid::compute(&bytes))
+    }
+}
diff --git a/brit-epr/src/engine/error.rs b/brit-epr/src/engine/error.rs
new file mode 100644
index 00000000000..6c306165c6d
--- /dev/null
+++ b/brit-epr/src/engine/error.rs
@@ -0,0 +1,35 @@
+//! Engine-level error types.
+
+use thiserror::Error;
+
+/// Errors raised by the covenant engine's generic layer.
+#[derive(Debug, Error)]
+pub enum EngineError {
+    /// Unable to extract a trailer block from a commit body.
+    #[error("failed to parse trailer block: {0}")]
+    TrailerBlockParse(String),
+}
+
+/// Errors emitted by schema validation.
App schemas return this type from +/// `AppSchema::validate_pair` and `AppSchema::validate_set`. +/// +/// Variants are intentionally broad because different app schemas will +/// express different failure modes. A richer error type can layer on top. +#[derive(Debug, Error, PartialEq, Eq)] +pub enum ValidationError { + /// A required trailer key was absent from the set. + #[error("required trailer key missing: {0}")] + MissingKey(String), + + /// A trailer value is present but empty or whitespace-only. + #[error("trailer key {0} has empty value")] + EmptyValue(String), + + /// A trailer value failed a format check (e.g., malformed CID). + #[error("trailer key {0} malformed: {1}")] + MalformedValue(String, String), + + /// Cross-field rule violated. + #[error("trailer set failed cross-field rule: {0}")] + CrossFieldRule(String), +} diff --git a/brit-epr/src/engine/mod.rs b/brit-epr/src/engine/mod.rs new file mode 100644 index 00000000000..986716cedee --- /dev/null +++ b/brit-epr/src/engine/mod.rs @@ -0,0 +1,20 @@ +//! Covenant engine — unconditional layer that knows the trailer format and +//! dispatch contract but not any specific schema vocabulary. + +mod app_schema; +pub mod cid; +pub mod content_node; +mod error; +pub mod object_store; +pub mod signing; +mod trailer_block; +mod trailer_set; + +pub use app_schema::AppSchema; +pub use cid::{BritCid, CidParseError}; +pub use content_node::ContentNode; +pub use error::{EngineError, ValidationError}; +pub use object_store::{LocalObjectStore, ObjectStoreError}; +pub use signing::{verify_signature, verify_signed_node, AgentKey, AgentKeyError, Signed}; +pub use trailer_block::parse_trailer_block; +pub use trailer_set::TrailerSet; diff --git a/brit-epr/src/engine/object_store.rs b/brit-epr/src/engine/object_store.rs new file mode 100644 index 00000000000..b902f1bc145 --- /dev/null +++ b/brit-epr/src/engine/object_store.rs @@ -0,0 +1,84 @@ +//! `LocalObjectStore` — stores ContentNodes as JSON files under +//! 
`.git/brit/objects/`, addressed by their BritCid.
+
+use std::fs;
+use std::path::PathBuf;
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+
+/// Filesystem-backed content-addressed store.
+pub struct LocalObjectStore {
+    base_dir: PathBuf,
+}
+
+impl LocalObjectStore {
+    /// Create a store rooted at the given directory.
+    pub fn new(base_dir: PathBuf) -> Self {
+        Self { base_dir }
+    }
+
+    /// Create a store for a git repo by locating `.git/brit/objects/`.
+    pub fn for_git_dir(git_dir: &std::path::Path) -> Self {
+        Self::new(git_dir.join("brit").join("objects"))
+    }
+
+    /// Store a ContentNode. Returns its CID. Idempotent.
+    pub fn put<T: ContentNode>(&self, node: &T) -> Result<BritCid, ObjectStoreError> {
+        let json = node.canonical_json().map_err(ObjectStoreError::Serialize)?;
+        let cid = BritCid::compute(&json);
+        fs::create_dir_all(&self.base_dir).map_err(ObjectStoreError::Io)?;
+        let path = self.base_dir.join(cid.as_str());
+        // Atomic write: temp file + rename prevents partial writes on crash.
+        let tmp_path = self.base_dir.join(format!("{}.tmp", cid.as_str()));
+        fs::write(&tmp_path, &json).map_err(ObjectStoreError::Io)?;
+        fs::rename(&tmp_path, &path).map_err(ObjectStoreError::Io)?;
+        Ok(cid)
+    }
+
+    /// Retrieve a ContentNode by CID.
+    pub fn get<T: ContentNode>(&self, cid: &BritCid) -> Result<T, ObjectStoreError> {
+        let path = self.base_dir.join(cid.as_str());
+        let bytes = fs::read(&path).map_err(|e| {
+            if e.kind() == std::io::ErrorKind::NotFound {
+                ObjectStoreError::NotFound(cid.clone())
+            } else {
+                ObjectStoreError::Io(e)
+            }
+        })?;
+        serde_json::from_slice(&bytes).map_err(ObjectStoreError::Deserialize)
+    }
+
+    /// List all stored CIDs.
+    pub fn list(&self) -> Result<Vec<BritCid>, ObjectStoreError> {
+        if !self.base_dir.exists() {
+            return Ok(Vec::new());
+        }
+        let mut cids = Vec::new();
+        for entry in fs::read_dir(&self.base_dir).map_err(ObjectStoreError::Io)? {
+            let entry = entry.map_err(ObjectStoreError::Io)?;
+            if let Some(name) = entry.file_name().to_str() {
+                if let Ok(cid) = name.parse::<BritCid>() {
+                    cids.push(cid);
+                }
+            }
+        }
+        Ok(cids)
+    }
+}
+
+/// Errors from the local object store.
+#[derive(Debug, thiserror::Error)]
+pub enum ObjectStoreError {
+    /// Filesystem error.
+    #[error("I/O error: {0}")]
+    Io(#[from] std::io::Error),
+    /// Serialization failed.
+    #[error("serialization error: {0}")]
+    Serialize(serde_json::Error),
+    /// Deserialization failed.
+    #[error("deserialization error: {0}")]
+    Deserialize(serde_json::Error),
+    /// Object not found.
+    #[error("object not found: {0}")]
+    NotFound(BritCid),
+}
diff --git a/brit-epr/src/engine/signing.rs b/brit-epr/src/engine/signing.rs
new file mode 100644
index 00000000000..2390c1bac6a
--- /dev/null
+++ b/brit-epr/src/engine/signing.rs
@@ -0,0 +1,173 @@
+//! Agent signing — ed25519 keypair management for attestation signatures.
+
+use std::fs;
+use std::path::{Path, PathBuf};
+use ed25519_dalek::{Signer, SigningKey, VerifyingKey};
+
+/// An agent's signing identity, loaded from or generated to a file.
+pub struct AgentKey {
+    signing_key: SigningKey,
+    key_path: PathBuf,
+}
+
+impl AgentKey {
+    /// Load an existing key or generate a new one at the given path.
+    pub fn load_or_generate(key_path: &Path) -> Result<Self, AgentKeyError> {
+        if key_path.exists() {
+            Self::load(key_path)
+        } else {
+            Self::generate(key_path)
+        }
+    }
+
+    /// Load from an existing 32-byte seed file.
+    pub fn load(key_path: &Path) -> Result<Self, AgentKeyError> {
+        let bytes = fs::read(key_path).map_err(AgentKeyError::Io)?;
+        if bytes.len() != 32 {
+            return Err(AgentKeyError::InvalidKeyLength(bytes.len()));
+        }
+        let seed: [u8; 32] = bytes.try_into().map_err(|_| AgentKeyError::InvalidKeyLength(0))?;
+        let signing_key = SigningKey::from_bytes(&seed);
+        Ok(Self { signing_key, key_path: key_path.to_path_buf() })
+    }
+
+    /// Generate a new keypair and write the 32-byte seed to disk.
+    pub fn generate(key_path: &Path) -> Result<Self, AgentKeyError> {
+        let mut rng = rand::thread_rng();
+        let signing_key = SigningKey::generate(&mut rng);
+        if let Some(parent) = key_path.parent() {
+            fs::create_dir_all(parent).map_err(AgentKeyError::Io)?;
+        }
+        fs::write(key_path, signing_key.to_bytes()).map_err(AgentKeyError::Io)?;
+        // Restrict key file to owner-only read/write (0600) — the seed is a
+        // private key and must not be readable by other users on shared CI.
+        #[cfg(unix)]
+        {
+            use std::os::unix::fs::PermissionsExt;
+            fs::set_permissions(key_path, fs::Permissions::from_mode(0o600))
+                .map_err(AgentKeyError::Io)?;
+        }
+        Ok(Self { signing_key, key_path: key_path.to_path_buf() })
+    }
+
+    /// Sign arbitrary bytes. Returns the 64-byte ed25519 signature as hex.
+    pub fn sign(&self, payload: &[u8]) -> String {
+        let sig = self.signing_key.sign(payload);
+        hex::encode(sig.to_bytes())
+    }
+
+    /// The agent's public key as a 64-character hex string.
+    pub fn agent_id(&self) -> String {
+        hex::encode(self.signing_key.verifying_key().to_bytes())
+    }
+
+    /// The verifying (public) key.
+    pub fn verifying_key(&self) -> VerifyingKey {
+        self.signing_key.verifying_key()
+    }
+
+    /// Path where the key is stored.
+    pub fn key_path(&self) -> &Path {
+        &self.key_path
+    }
+}
+
+/// Trait for ContentNodes that carry a signature field.
+///
+/// Implementors provide access to the signature and agent_id, plus the
+/// ability to produce an unsigned clone for verification.
+pub trait Signed: crate::engine::content_node::ContentNode + Clone {
+    /// The hex-encoded signature.
+    fn signature(&self) -> &str;
+    /// The hex-encoded agent public key.
+    fn agent_id(&self) -> &str;
+    /// Return a clone with the signature field set to empty string.
+    fn without_signature(&self) -> Self;
+}
+
+/// Verify a signed ContentNode's signature.
+///
+/// Zeros the signature field, computes canonical JSON, and verifies against
+/// the agent_id embedded in the node.
+pub fn verify_signed_node<T: Signed>(node: &T) -> Result<bool, AgentKeyError> {
+    let unsigned = node.without_signature();
+    let canonical = unsigned
+        .canonical_json()
+        .map_err(|e| AgentKeyError::Io(std::io::Error::other(e)))?;
+    verify_signature(&canonical, node.signature(), node.agent_id())
+}
+
+/// Verify a hex-encoded signature against a hex-encoded public key.
+pub fn verify_signature(
+    payload: &[u8],
+    signature_hex: &str,
+    pubkey_hex: &str,
+) -> Result<bool, AgentKeyError> {
+    let sig_bytes = hex::decode(signature_hex).map_err(|_| AgentKeyError::InvalidSignatureHex)?;
+    let sig = ed25519_dalek::Signature::from_slice(&sig_bytes)
+        .map_err(|_| AgentKeyError::InvalidSignatureHex)?;
+    let pub_bytes = hex::decode(pubkey_hex).map_err(|_| AgentKeyError::InvalidPubkeyHex)?;
+    let pubkey = VerifyingKey::from_bytes(
+        &pub_bytes.try_into().map_err(|_| AgentKeyError::InvalidPubkeyHex)?,
+    ).map_err(|_| AgentKeyError::InvalidPubkeyHex)?;
+    Ok(pubkey.verify_strict(payload, &sig).is_ok())
+}
+
+/// Agent key errors.
+#[derive(Debug, thiserror::Error)]
+pub enum AgentKeyError {
+    /// Filesystem error.
+    #[error("I/O error: {0}")]
+    Io(std::io::Error),
+    /// Key file has wrong length.
+    #[error("expected 32-byte key seed, got {0} bytes")]
+    InvalidKeyLength(usize),
+    /// Signature hex is invalid.
+    #[error("invalid signature hex")]
+    InvalidSignatureHex,
+    /// Public key hex is invalid.
+ #[error("invalid public key hex")] + InvalidPubkeyHex, +} + +#[cfg(test)] +mod tests { + use super::*; + use tempfile::TempDir; + + #[test] + fn generate_load_roundtrip() { + let tmp = TempDir::new().unwrap(); + let path = tmp.path().join("brit").join("agent-key"); + let key1 = AgentKey::generate(&path).unwrap(); + let key2 = AgentKey::load(&path).unwrap(); + assert_eq!(key1.agent_id(), key2.agent_id()); + } + + #[test] + fn sign_and_verify() { + let tmp = TempDir::new().unwrap(); + let key = AgentKey::generate(&tmp.path().join("key")).unwrap(); + let payload = b"attestation payload"; + let sig = key.sign(payload); + assert!(verify_signature(payload, &sig, &key.agent_id()).unwrap()); + } + + #[test] + fn wrong_payload_fails_verify() { + let tmp = TempDir::new().unwrap(); + let key = AgentKey::generate(&tmp.path().join("key")).unwrap(); + let sig = key.sign(b"original"); + assert!(!verify_signature(b"tampered", &sig, &key.agent_id()).unwrap()); + } + + #[test] + fn load_or_generate_creates_if_missing() { + let tmp = TempDir::new().unwrap(); + let path = tmp.path().join("agent-key"); + assert!(!path.exists()); + let key = AgentKey::load_or_generate(&path).unwrap(); + assert!(path.exists()); + assert_eq!(key.agent_id().len(), 64); + } +} diff --git a/brit-epr/src/engine/trailer_block.rs b/brit-epr/src/engine/trailer_block.rs new file mode 100644 index 00000000000..9c01c418381 --- /dev/null +++ b/brit-epr/src/engine/trailer_block.rs @@ -0,0 +1,29 @@ +//! `parse_trailer_block` — extract a commit's RFC-822-style trailer block +//! into a `TrailerSet`. Wraps `gix_object::commit::message::BodyRef::trailers()`. + +use gix_object::bstr::ByteSlice; +use gix_object::commit::message::BodyRef; + +use crate::engine::TrailerSet; + +/// Parse a commit body's bytes into a `TrailerSet`. +/// +/// The body is the message *after* the commit headers (author, committer, +/// tree, parent lines) — i.e., what gitoxide calls "the body" of a commit. 
+/// This function extracts the final trailing block of `Key: value` lines
+/// (if any) and records each as an entry in a `TrailerSet`, preserving
+/// insertion order.
+///
+/// Returns an empty `TrailerSet` if the body has no trailer block.
+pub fn parse_trailer_block(body: &[u8]) -> TrailerSet {
+    let body_ref = BodyRef::from_bytes(body);
+    let mut set = TrailerSet::new();
+
+    for trailer in body_ref.trailers() {
+        let key = String::from_utf8_lossy(trailer.token.as_bytes()).into_owned();
+        let value = String::from_utf8_lossy(trailer.value.as_bytes()).into_owned();
+        set.push(key, value);
+    }
+
+    set
+}
diff --git a/brit-epr/src/engine/trailer_set.rs b/brit-epr/src/engine/trailer_set.rs
new file mode 100644
index 00000000000..d3b19ca8ae4
--- /dev/null
+++ b/brit-epr/src/engine/trailer_set.rs
@@ -0,0 +1,69 @@
+//! `TrailerSet` — ordered, duplicate-aware key/value pairs from a commit
+//! trailer block. Preserves insertion order for roundtrip-compatible
+//! rendering.
+
+use std::fmt;
+
+/// A commit trailer block, parsed into ordered key/value pairs.
+///
+/// Order is preserved because the engine must be able to re-render the
+/// trailer block byte-identically for signing and round-trip use cases.
+/// Duplicate keys are allowed (e.g., multiple `Signed-off-by:` or
+/// repeatable app-schema keys like `Built-By:`).
+#[derive(Debug, Clone, Default, PartialEq, Eq)]
+pub struct TrailerSet {
+    entries: Vec<(String, String)>,
+}
+
+impl TrailerSet {
+    /// Create an empty set.
+    pub fn new() -> Self {
+        Self { entries: Vec::new() }
+    }
+
+    /// Append a trailer entry, preserving insertion order.
+    pub fn push(&mut self, key: impl Into<String>, value: impl Into<String>) {
+        self.entries.push((key.into(), value.into()));
+    }
+
+    /// Return the first value for a given key, or `None` if absent.
+    pub fn get(&self, key: &str) -> Option<&str> {
+        self.entries
+            .iter()
+            .find(|(k, _)| k == key)
+            .map(|(_, v)| v.as_str())
+    }
+
+    /// Return all values for a given key (preserves order).
+    pub fn get_all(&self, key: &str) -> Vec<&str> {
+        self.entries
+            .iter()
+            .filter(|(k, _)| k == key)
+            .map(|(_, v)| v.as_str())
+            .collect()
+    }
+
+    /// Iterate over all `(key, value)` pairs in insertion order.
+    pub fn iter(&self) -> impl Iterator<Item = (&str, &str)> {
+        self.entries.iter().map(|(k, v)| (k.as_str(), v.as_str()))
+    }
+
+    /// Number of entries.
+    pub fn len(&self) -> usize {
+        self.entries.len()
+    }
+
+    /// True when there are no entries.
+    pub fn is_empty(&self) -> bool {
+        self.entries.is_empty()
+    }
+}
+
+impl fmt::Display for TrailerSet {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        for (k, v) in &self.entries {
+            writeln!(f, "{k}: {v}")?;
+        }
+        Ok(())
+    }
+}
diff --git a/brit-epr/src/lib.rs b/brit-epr/src/lib.rs
new file mode 100644
index 00000000000..4e2633db713
--- /dev/null
+++ b/brit-epr/src/lib.rs
@@ -0,0 +1,47 @@
+//! Elohim Protocol primitives for brit.
+//!
+//! `brit-epr` has two layers:
+//!
+//! - **`engine`** — unconditional. The covenant engine: trailer parser,
+//!   `AppSchema` dispatch trait, `TrailerSet`, validation errors. Does not know
+//!   which schema is plugged in. A downstream fork can disable the default
+//!   feature and ship its own app schema on this engine.
+//! - **`elohim`** — feature-gated behind `elohim-protocol` (default on). The
+//!   first-party Elohim Protocol app schema: pillar trailer types (Lamad,
+//!   Shefa, Qahal), the concrete `ElohimProtocolSchema` implementor, parse
+//!   and validate convenience functions.
+//!
+//! The normative specification for the trailer format, pillar meanings, and
+//! validation rules lives in `docs/schemas/elohim-protocol-manifest.md` at
+//! the root of the brit repository. When this crate and the schema doc
+//! disagree, the schema doc wins.
+ +#![deny(missing_docs, rust_2018_idioms)] +#![forbid(unsafe_code)] + +pub mod engine; + +#[cfg(feature = "elohim-protocol")] +pub mod elohim; + +// Unconditional re-exports +pub use engine::{AppSchema, BritCid, CidParseError, ContentNode, LocalObjectStore, ObjectStoreError, TrailerSet, ValidationError}; + +// Feature-gated re-exports +#[cfg(feature = "elohim-protocol")] +pub use elohim::{ + parse_pillar_trailers, validate_pillar_trailers, ElohimProtocolSchema, PillarTrailers, + PillarValidationError, TrailerKey, +}; + +/// Convenience re-exports for attestation types. +#[cfg(feature = "elohim-protocol")] +pub mod attestation { + pub use crate::elohim::attestation::build::BuildAttestationContentNode; + pub use crate::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus}; + pub use crate::elohim::attestation::reach::{compute_reach, ReachInput, ReachLevel}; + pub use crate::elohim::attestation::validation::{ + ValidationAttestationContentNode, ValidationResult, + }; + pub use crate::elohim::refs::BritRefManager; +} diff --git a/brit-epr/tests/attestation_roundtrip.rs b/brit-epr/tests/attestation_roundtrip.rs new file mode 100644 index 00000000000..bb616f87c3b --- /dev/null +++ b/brit-epr/tests/attestation_roundtrip.rs @@ -0,0 +1,127 @@ +use brit_epr::engine::cid::BritCid; +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::signing::{verify_signed_node, AgentKey}; +use brit_epr::elohim::attestation::build::BuildAttestationContentNode; +use brit_epr::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus}; +use brit_epr::elohim::attestation::validation::{ValidationAttestationContentNode, ValidationResult}; +use tempfile::TempDir; + +fn sample_cid() -> BritCid { BritCid::compute(b"sample artifact") } + +#[test] +fn build_attestation_roundtrips() { + let node = BuildAttestationContentNode { + manifest_cid: sample_cid(), + step_name: "elohim-edge:cargo-build-storage".into(), + inputs_hash: "abc123def456".into(), 
+ output_cid: BritCid::compute(b"output artifact"), + agent_id: "deadbeef".repeat(4), + hardware_profile: serde_json::json!({"arch": "x86_64", "os": "linux", "memory_gb": 32}), + build_duration_ms: 45_000, + built_at: "2026-04-16T10:00:00Z".into(), + success: true, + signature: "sig_placeholder".into(), + }; + let json = serde_json::to_string_pretty(&node).unwrap(); + let back: BuildAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.build-attestation"); + let cid1 = node.compute_cid().unwrap(); + let cid2 = back.compute_cid().unwrap(); + assert_eq!(cid1, cid2); +} + +#[test] +fn deploy_attestation_roundtrips() { + let node = DeployAttestationContentNode { + artifact_cid: sample_cid(), + step_name: "elohim-edge:cargo-build-storage".into(), + environment_label: "staging".into(), + endpoint: "https://staging.elohim.host".into(), + health_check_epr: "epr:staging-storage/health".into(), + health_status: HealthStatus::Healthy, + deployed_at: "2026-04-16T10:05:00Z".into(), + attested_at: "2026-04-16T10:05:30Z".into(), + liveness_ttl_sec: 300, + agent_id: "deadbeef".repeat(4), + signature: "sig_placeholder".into(), + }; + let json = serde_json::to_string_pretty(&node).unwrap(); + let back: DeployAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.deploy-attestation"); +} + +#[test] +fn validation_attestation_roundtrips() { + let node = ValidationAttestationContentNode { + artifact_cid: sample_cid(), + check_name: "sonarqube-scan@v10".into(), + validator_id: "sonarqube-agent-001".into(), + validator_version: "10.7.0".into(), + result: ValidationResult::Pass, + result_summary: "0 bugs, 0 vulnerabilities, 2 code smells".into(), + findings_cid: None, + validated_at: "2026-04-16T10:10:00Z".into(), + ttl_sec: Some(86_400), + signature: "sig_placeholder".into(), + }; + let json = serde_json::to_string_pretty(&node).unwrap(); + 
let back: ValidationAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.validation-attestation"); +} + +#[test] +fn validation_result_serializes_as_lowercase() { + assert_eq!(serde_json::to_string(&ValidationResult::Pass).unwrap(), r#""pass""#); + assert_eq!(serde_json::to_string(&ValidationResult::Fail).unwrap(), r#""fail""#); + assert_eq!(serde_json::to_string(&ValidationResult::Warn).unwrap(), r#""warn""#); + assert_eq!(serde_json::to_string(&ValidationResult::Skip).unwrap(), r#""skip""#); +} + +#[test] +fn health_status_serializes_as_lowercase() { + assert_eq!(serde_json::to_string(&HealthStatus::Healthy).unwrap(), r#""healthy""#); + assert_eq!(serde_json::to_string(&HealthStatus::Degraded).unwrap(), r#""degraded""#); + assert_eq!(serde_json::to_string(&HealthStatus::Unreachable).unwrap(), r#""unreachable""#); +} + +#[test] +fn build_attestation_sign_store_retrieve_verify() { + let tmp = TempDir::new().unwrap(); + let key = AgentKey::generate(&tmp.path().join("agent-key")).unwrap(); + let store = brit_epr::engine::object_store::LocalObjectStore::new(tmp.path().join("objects")); + + // Construct with empty signature + let mut node = BuildAttestationContentNode { + manifest_cid: sample_cid(), + step_name: "test:step".into(), + inputs_hash: "inputs-abc".into(), + output_cid: BritCid::compute(b"output"), + agent_id: key.agent_id(), + hardware_profile: serde_json::json!({}), + build_duration_ms: 1000, + built_at: "2026-04-16T12:00:00Z".into(), + success: true, + signature: String::new(), + }; + + // Sign + let canonical = node.canonical_json().unwrap(); + node.signature = key.sign(&canonical); + + // Store + let cid = store.put(&node).unwrap(); + + // Retrieve + let back: BuildAttestationContentNode = store.get(&cid).unwrap(); + + // Verify — must succeed + assert!(verify_signed_node(&back).unwrap(), "signature verification should pass"); + + // Tamper — must fail + let mut tampered = back; 
+    tampered.success = false;
+    assert!(!verify_signed_node(&tampered).unwrap(), "tampered node should fail verification");
+}
diff --git a/brit-epr/tests/elohim_parse.rs b/brit-epr/tests/elohim_parse.rs
new file mode 100644
index 00000000000..c1dca1f6260
--- /dev/null
+++ b/brit-epr/tests/elohim_parse.rs
@@ -0,0 +1,54 @@
+//! Integration tests for elohim pillar trailer parsing.
+
+use brit_epr::{parse_pillar_trailers, PillarTrailers};
+
+fn fixture(name: &str) -> Vec<u8> {
+    let path = format!("tests/fixtures/{}", name);
+    std::fs::read(&path).unwrap_or_else(|e| panic!("failed to read fixture {path}: {e}"))
+}
+
+#[test]
+fn happy_path_all_three_pillars_parse() {
+    let body = fixture("happy_all_three_pillars.txt");
+    let trailers: PillarTrailers = parse_pillar_trailers(&body);
+
+    assert_eq!(
+        trailers.lamad.as_deref(),
+        Some("introduces pillar trailer model; first testable EPR primitive")
+    );
+    assert_eq!(
+        trailers.shefa.as_deref(),
+        Some("stewardship by @matthew; contributor credit via git author")
+    );
+    assert_eq!(
+        trailers.qahal.as_deref(),
+        Some("no governance review required for scaffolding")
+    );
+    assert_eq!(trailers.lamad_node, None);
+    assert_eq!(trailers.shefa_node, None);
+    assert_eq!(trailers.qahal_node, None);
+}
+
+#[test]
+fn missing_qahal_parses_partially() {
+    let body = fixture("missing_qahal.txt");
+    let trailers = parse_pillar_trailers(&body);
+
+    assert_eq!(trailers.lamad.as_deref(), Some("no knowledge change — pure refactor"));
+    assert_eq!(trailers.shefa.as_deref(), Some("no value flow — maintenance work"));
+    assert_eq!(trailers.qahal, None);
+}
+
+#[test]
+fn malformed_shefa_node_stored_as_raw_string() {
+    let body = fixture("malformed_shefa_node.txt");
+    let trailers = parse_pillar_trailers(&body);
+
+    assert_eq!(trailers.lamad.as_deref(), Some("teaches the permissive parser behavior"));
+    assert_eq!(trailers.shefa.as_deref(), Some("value summary is fine"));
+    assert_eq!(trailers.qahal.as_deref(), Some("governance review complete"));
+ + // Phase 1 is permissive — stores raw string without parsing. + // Phase 2 will add typed CID parsing and reject malformed values. + assert_eq!(trailers.shefa_node.as_deref(), Some("not-a-valid-cid-at-all")); +} diff --git a/brit-epr/tests/elohim_validate.rs b/brit-epr/tests/elohim_validate.rs new file mode 100644 index 00000000000..0d80235345e --- /dev/null +++ b/brit-epr/tests/elohim_validate.rs @@ -0,0 +1,53 @@ +//! Integration tests for elohim pillar structural validation. + +use brit_epr::{validate_pillar_trailers, PillarTrailers, PillarValidationError, TrailerKey}; + +fn complete() -> PillarTrailers { + PillarTrailers { + lamad: Some("knowledge summary".into()), + shefa: Some("economic summary".into()), + qahal: Some("governance summary".into()), + lamad_node: None, + shefa_node: None, + qahal_node: None, + } +} + +#[test] +fn all_three_present_validates_ok() { + assert_eq!(validate_pillar_trailers(&complete()), Ok(())); +} + +#[test] +fn missing_lamad_fails_with_missing_pillar() { + let mut t = complete(); + t.lamad = None; + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::MissingPillar(TrailerKey::Lamad)) + ); +} + +#[test] +fn empty_shefa_fails_with_empty_pillar() { + let mut t = complete(); + t.shefa = Some(" ".into()); + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::EmptyPillar(TrailerKey::Shefa)) + ); +} + +#[test] +fn returns_first_error_in_canonical_order() { + let t = PillarTrailers { + lamad: None, + shefa: Some("ok".into()), + qahal: None, + ..Default::default() + }; + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::MissingPillar(TrailerKey::Lamad)) + ); +} diff --git a/brit-epr/tests/engine_parsing.rs b/brit-epr/tests/engine_parsing.rs new file mode 100644 index 00000000000..39fc186e5f1 --- /dev/null +++ b/brit-epr/tests/engine_parsing.rs @@ -0,0 +1,33 @@ +//! Engine-level tests — trailer block extraction, no app-schema semantics. 
+ +use brit_epr::engine::{parse_trailer_block, TrailerSet}; + +#[test] +fn extracts_trailer_block_from_commit_body() { + let body = b"\ +Add pillar trailer parser + +Wires gix-object into the covenant engine so trailer blocks can be +extracted into a schema-agnostic TrailerSet. + +Signed-off-by: Matthew Dowell +Lamad: introduces pillar trailer model +Shefa: stewardship by @matthew +Qahal: no governance review required +"; + + let trailers: TrailerSet = parse_trailer_block(body); + + assert_eq!(trailers.len(), 4, "expected 4 trailers, got {}", trailers.len()); + assert_eq!(trailers.get("Signed-off-by"), Some("Matthew Dowell ")); + assert_eq!(trailers.get("Lamad"), Some("introduces pillar trailer model")); + assert_eq!(trailers.get("Shefa"), Some("stewardship by @matthew")); + assert_eq!(trailers.get("Qahal"), Some("no governance review required")); +} + +#[test] +fn empty_trailer_block_returns_empty_set() { + let body = b"Commit with no trailers at all, just a body."; + let trailers = parse_trailer_block(body); + assert_eq!(trailers.len(), 0); +} diff --git a/brit-epr/tests/fixtures/happy_all_three_pillars.txt b/brit-epr/tests/fixtures/happy_all_three_pillars.txt new file mode 100644 index 00000000000..134fef62aa4 --- /dev/null +++ b/brit-epr/tests/fixtures/happy_all_three_pillars.txt @@ -0,0 +1,9 @@ +Add pillar trailer parser + +Wires gix-object::BodyRef::trailers() into the brit-epr engine so +commit messages can carry Lamad / Shefa / Qahal values natively. 
+ +Signed-off-by: Matthew Dowell +Lamad: introduces pillar trailer model; first testable EPR primitive +Shefa: stewardship by @matthew; contributor credit via git author +Qahal: no governance review required for scaffolding diff --git a/brit-epr/tests/fixtures/malformed_shefa_node.txt b/brit-epr/tests/fixtures/malformed_shefa_node.txt new file mode 100644 index 00000000000..42c7c138f2b --- /dev/null +++ b/brit-epr/tests/fixtures/malformed_shefa_node.txt @@ -0,0 +1,6 @@ +Test permissive parser behavior for malformed node ref + +Lamad: teaches the permissive parser behavior +Shefa: value summary is fine +Shefa-Node: not-a-valid-cid-at-all +Qahal: governance review complete diff --git a/brit-epr/tests/fixtures/missing_qahal.txt b/brit-epr/tests/fixtures/missing_qahal.txt new file mode 100644 index 00000000000..0ef35016da7 --- /dev/null +++ b/brit-epr/tests/fixtures/missing_qahal.txt @@ -0,0 +1,4 @@ +Routine refactor with only two pillars declared + +Lamad: no knowledge change — pure refactor +Shefa: no value flow — maintenance work diff --git a/brit-epr/tests/object_store.rs b/brit-epr/tests/object_store.rs new file mode 100644 index 00000000000..9bf5fc7c67f --- /dev/null +++ b/brit-epr/tests/object_store.rs @@ -0,0 +1,59 @@ +use brit_epr::engine::cid::BritCid; +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::object_store::LocalObjectStore; +use serde::{Deserialize, Serialize}; +use tempfile::TempDir; + +#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)] +struct TestNode { + name: String, + value: u32, +} + +impl ContentNode for TestNode { + fn content_type(&self) -> &'static str { + "test.node" + } +} + +#[test] +fn put_then_get_roundtrips() { + let tmp = TempDir::new().unwrap(); + let store = LocalObjectStore::new(tmp.path().join("objects")); + let node = TestNode { name: "hello".into(), value: 42 }; + let cid = store.put(&node).unwrap(); + let back: TestNode = store.get(&cid).unwrap(); + assert_eq!(node, back); +} + +#[test] +fn 
same_content_same_cid() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+    let node = TestNode { name: "deterministic".into(), value: 7 };
+    let cid1 = store.put(&node).unwrap();
+    let cid2 = store.put(&node).unwrap();
+    assert_eq!(cid1, cid2);
+}
+
+#[test]
+fn get_missing_cid_returns_error() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+    let fake_cid = BritCid::compute(b"does not exist");
+    let result = store.get::<TestNode>(&fake_cid);
+    assert!(result.is_err());
+}
+
+#[test]
+fn list_returns_all_stored_cids() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+    let a = store.put(&TestNode { name: "a".into(), value: 1 }).unwrap();
+    let b = store.put(&TestNode { name: "b".into(), value: 2 }).unwrap();
+    let mut cids = store.list().unwrap();
+    cids.sort_by(|x, y| x.as_str().cmp(y.as_str()));
+    let mut expected = vec![a, b];
+    expected.sort_by(|x, y| x.as_str().cmp(y.as_str()));
+    assert_eq!(cids, expected);
+}
diff --git a/brit-epr/tests/reach_computation.rs b/brit-epr/tests/reach_computation.rs
new file mode 100644
index 00000000000..eabe1612051
--- /dev/null
+++ b/brit-epr/tests/reach_computation.rs
@@ -0,0 +1,33 @@
+use brit_epr::elohim::attestation::reach::{compute_reach, ReachInput, ReachLevel};
+
+#[test]
+fn no_attestations_returns_unknown() {
+    let input = ReachInput { build_attestations: vec![], deploy_attestations: vec![], validation_attestations: vec![] };
+    assert_eq!(compute_reach(&input), ReachLevel::Unknown);
+}
+
+#[test]
+fn build_only_returns_built() {
+    let input = ReachInput { build_attestations: vec!["agent-a".into()], deploy_attestations: vec![], validation_attestations: vec![] };
+    assert_eq!(compute_reach(&input), ReachLevel::Built);
+}
+
+#[test]
+fn build_plus_deploy_returns_deployed() {
+    let input = ReachInput { build_attestations: vec!["agent-a".into()], deploy_attestations:
vec!["staging".into()], validation_attestations: vec![] }; + assert_eq!(compute_reach(&input), ReachLevel::Deployed); +} + +#[test] +fn build_plus_deploy_plus_validation_returns_verified() { + let input = ReachInput { build_attestations: vec!["agent-a".into()], deploy_attestations: vec!["staging".into()], validation_attestations: vec!["sonarqube-scan@v10".into()] }; + assert_eq!(compute_reach(&input), ReachLevel::Verified); +} + +#[test] +fn same_inputs_same_result() { + let input = ReachInput { build_attestations: vec!["agent-a".into(), "agent-b".into()], deploy_attestations: vec!["staging".into()], validation_attestations: vec!["trivy@latest".into(), "sonarqube@v10".into()] }; + let r1 = compute_reach(&input); + let r2 = compute_reach(&input); + assert_eq!(r1, r2, "reach computation must be deterministic"); +} diff --git a/brit-epr/tests/ref_management.rs b/brit-epr/tests/ref_management.rs new file mode 100644 index 00000000000..92d1cac4e35 --- /dev/null +++ b/brit-epr/tests/ref_management.rs @@ -0,0 +1,59 @@ +use brit_epr::elohim::refs::BritRefManager; +use std::process::Command; +use tempfile::TempDir; + +fn init_git_repo() -> TempDir { + let tmp = TempDir::new().unwrap(); + Command::new("git").args(["init", "--initial-branch=main"]).current_dir(tmp.path()).output().unwrap(); + Command::new("git").args(["-c", "user.email=test@test.com", "-c", "user.name=test", "commit", "--allow-empty", "-m", "init"]).current_dir(tmp.path()).output().unwrap(); + tmp +} + +#[test] +fn put_and_get_build_ref() { + let tmp = init_git_repo(); + let mgr = BritRefManager::new(tmp.path()).unwrap(); + let payload = serde_json::json!({"attestationCid": "abc123", "outputCid": "def456", "agentId": "agent001", "builtAt": "2026-04-16T10:00:00Z"}); + mgr.put_build_ref("elohim-edge:storage", "HEAD", &payload).unwrap(); + let got = mgr.get_build_ref("elohim-edge:storage", "HEAD").unwrap(); + assert_eq!(got, Some(payload)); +} + +#[test] +fn get_missing_ref_returns_none() { + let tmp = 
init_git_repo(); + let mgr = BritRefManager::new(tmp.path()).unwrap(); + let got = mgr.get_build_ref("nonexistent", "HEAD").unwrap(); + assert_eq!(got, None); +} + +#[test] +fn put_and_get_deploy_ref() { + let tmp = init_git_repo(); + let mgr = BritRefManager::new(tmp.path()).unwrap(); + let payload = serde_json::json!({"artifactCid": "abc123", "healthStatus": "healthy"}); + mgr.put_deploy_ref("elohim-edge:storage", "staging", &payload).unwrap(); + let got = mgr.get_deploy_ref("elohim-edge:storage", "staging").unwrap(); + assert_eq!(got, Some(payload)); +} + +#[test] +fn put_and_get_validate_ref() { + let tmp = init_git_repo(); + let mgr = BritRefManager::new(tmp.path()).unwrap(); + let payload = serde_json::json!({"artifactCid": "abc123", "result": "pass"}); + mgr.put_validate_ref("elohim-edge:storage", "sonarqube-scan@v10", &payload).unwrap(); + let got = mgr.get_validate_ref("elohim-edge:storage", "sonarqube-scan@v10").unwrap(); + assert_eq!(got, Some(payload)); +} + +#[test] +fn list_build_refs() { + let tmp = init_git_repo(); + let mgr = BritRefManager::new(tmp.path()).unwrap(); + mgr.put_build_ref("step-a", "HEAD", &serde_json::json!({"a": 1})).unwrap(); + mgr.put_build_ref("step-b", "HEAD", &serde_json::json!({"b": 2})).unwrap(); + let mut refs = mgr.list_build_refs(None).unwrap(); + refs.sort(); + assert_eq!(refs, vec!["step-a", "step-b"]); +} diff --git a/brit-graph/Cargo.toml b/brit-graph/Cargo.toml new file mode 100644 index 00000000000..1d5f4cde7ce --- /dev/null +++ b/brit-graph/Cargo.toml @@ -0,0 +1,32 @@ +lints.workspace = true + +[package] +name = "brit-graph" +version = "0.0.0" +description = "EPR-native graph engine — DAG construction, affected tracking, topological planning" +repository = "https://github.com/ethosengine/brit" +authors = ["Matthew Dowell "] +license = "MIT OR Apache-2.0" +edition = "2021" +rust-version = "1.82" + +[lib] +doctest = false + +[dependencies] +brit-epr = { path = "../brit-epr", default-features = false } +globset = { 
version = "0.4", optional = true }
+gix = { version = "0.81", default-features = false, features = ["basic", "blob-diff", "sha1"], optional = true }
+petgraph = "0.7"
+rustc-hash = "2"
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+thiserror = "2"
+
+[features]
+default = []
+repo = ["dep:gix", "dep:globset"]
+
+[dev-dependencies]
+tempfile = "3"
+gix = { version = "0.81", default-features = false, features = ["basic", "blob-diff", "sha1"] }
diff --git a/brit-graph/src/affected.rs b/brit-graph/src/affected.rs
new file mode 100644
index 00000000000..5e04f76870f
--- /dev/null
+++ b/brit-graph/src/affected.rs
@@ -0,0 +1,159 @@
+//! Affected tracking — which nodes are affected and why.
+//!
+//! Given a set of initially-affected nodes (e.g., "this step's source files changed"),
+//! propagate through the graph to find all transitively affected nodes.
+//! Each affected node carries `Vec<AffectedBy>` explaining why it was affected.
+
+use std::collections::VecDeque;
+
+use brit_epr::{BritCid, ContentNode};
+use rustc_hash::{FxHashMap, FxHashSet};
+
+use crate::graph::{EprGraph, GraphError};
+use crate::traits::GraphConnections;
+
+/// Why a node was marked as affected.
+#[derive(Debug, Clone)]
+pub enum AffectedBy {
+    /// A source file matched an input pattern.
+    ChangedFile(String),
+    /// A dependency (upstream in the DAG) was affected.
+    UpstreamNode(BritCid),
+    /// A dependent (downstream in the DAG) was affected.
+    DownstreamNode(BritCid),
+    /// The content fingerprint of inputs changed.
+    InputFingerprint,
+    /// Explicitly marked as always-affected.
+    AlwaysAffected,
+}
+
+/// How far to propagate through the graph.
+#[derive(Debug, Clone, Copy, Default, PartialEq, Eq)]
+pub enum PropagationScope {
+    /// Don't propagate beyond the initial set.
+    None,
+    /// Propagate to immediate neighbors only.
+    Direct,
+    /// Propagate through the full transitive closure.
+    #[default]
+    Deep,
+}
+
+/// Tracks which nodes are affected during graph analysis.
+pub struct AffectedTracker<'g, N: ContentNode, E> {
+    graph: &'g EprGraph<N, E>,
+    affected: FxHashMap<BritCid, Vec<AffectedBy>>,
+    upstream_scope: PropagationScope,
+}
+
+impl<'g, N: ContentNode, E> AffectedTracker<'g, N, E> {
+    /// Create a new tracker for the given graph.
+    pub fn new(graph: &'g EprGraph<N, E>) -> Self {
+        Self {
+            graph,
+            affected: FxHashMap::default(),
+            upstream_scope: PropagationScope::Deep,
+        }
+    }
+
+    /// Set the upstream propagation scope (dependents of affected nodes).
+    pub fn set_upstream_scope(&mut self, scope: PropagationScope) {
+        self.upstream_scope = scope;
+    }
+
+    /// Mark a node as affected with a given reason.
+    pub fn mark_affected(&mut self, cid: BritCid, reason: AffectedBy) {
+        self.affected.entry(cid).or_default().push(reason);
+    }
+
+    /// Propagate affected state through the graph based on scope settings.
+    ///
+    /// "Upstream propagation" means: if node C is affected, and B depends on C,
+    /// then B is affected too (B is upstream — it's a dependent of C).
+    /// This matches build semantics: if a leaf changes, everything that
+    /// depends on it needs rebuilding.
+    pub fn propagate(&mut self) -> Result<(), GraphError> {
+        match self.upstream_scope {
+            PropagationScope::None => Ok(()),
+            PropagationScope::Direct => self.propagate_direct(),
+            PropagationScope::Deep => self.propagate_deep(),
+        }
+    }
+
+    /// Consume the tracker and produce the final affected set.
+    pub fn build(self) -> AffectedSet {
+        AffectedSet {
+            affected: self.affected,
+        }
+    }
+
+    fn propagate_direct(&mut self) -> Result<(), GraphError> {
+        let initial: Vec<BritCid> = self.affected.keys().cloned().collect();
+        for cid in initial {
+            let dependents = self.graph.dependents_of(&cid)?;
+            for dep_cid in dependents {
+                self.affected
+                    .entry(dep_cid)
+                    .or_default()
+                    .push(AffectedBy::UpstreamNode(cid.clone()));
+            }
+        }
+        Ok(())
+    }
+
+    fn propagate_deep(&mut self) -> Result<(), GraphError> {
+        let mut queue: VecDeque<BritCid> = self.affected.keys().cloned().collect();
+        let mut visited: FxHashSet<BritCid> = queue.iter().cloned().collect();
+
+        while let Some(cid) = queue.pop_front() {
+            let dependents = self.graph.dependents_of(&cid)?;
+            for dep_cid in dependents {
+                self.affected
+                    .entry(dep_cid.clone())
+                    .or_default()
+                    .push(AffectedBy::UpstreamNode(cid.clone()));
+                if visited.insert(dep_cid.clone()) {
+                    queue.push_back(dep_cid);
+                }
+            }
+        }
+        Ok(())
+    }
+}
+
+/// The result of affected tracking — an immutable set of affected nodes with reasons.
+pub struct AffectedSet {
+    affected: FxHashMap<BritCid, Vec<AffectedBy>>,
+}
+
+impl AffectedSet {
+    /// Check if a node is affected.
+    #[must_use]
+    pub fn is_affected(&self, cid: &BritCid) -> bool {
+        self.affected.contains_key(cid)
+    }
+
+    /// Get the reasons a node was affected. Returns None if not affected.
+    #[must_use]
+    pub fn reasons(&self, cid: &BritCid) -> Option<&[AffectedBy]> {
+        self.affected.get(cid).map(Vec::as_slice)
+    }
+
+    /// Get all affected CIDs.
+    #[must_use]
+    pub fn affected_cids(&self) -> Vec<BritCid> {
+        self.affected.keys().cloned().collect()
+    }
+
+    /// Number of affected nodes.
+    #[must_use]
+    pub fn len(&self) -> usize {
+        self.affected.len()
+    }
+
+    /// Whether the affected set is empty.
+    #[must_use]
+    pub fn is_empty(&self) -> bool {
+        self.affected.is_empty()
+    }
+}
diff --git a/brit-graph/src/fingerprint.rs b/brit-graph/src/fingerprint.rs
new file mode 100644
index 00000000000..e8ea0fa099f
--- /dev/null
+++ b/brit-graph/src/fingerprint.rs
@@ -0,0 +1,85 @@
+//! Content fingerprinting — deterministic hash over named inputs.
+//!
+//! A fingerprint is a `BritCid` computed from a sorted map of named inputs.
+//! Same inputs always produce the same fingerprint, regardless of insertion order.
+
+use std::collections::BTreeMap;
+
+use brit_epr::BritCid;
+
+/// Errors produced by repo-aware fingerprint construction.
+///
+/// Exported regardless of the `repo` feature so downstream code can match
+/// on it without conditional compilation. The variants only get instantiated
+/// when the `repo` feature is enabled and `ContentFingerprint::from_repo_globs`
+/// is called.
+#[derive(Debug, thiserror::Error)]
+pub enum FingerprintError {
+    /// Glob pattern compilation failed.
+    #[error("invalid glob pattern '{pattern}': {message}")]
+    InvalidGlob {
+        /// The pattern that failed to compile.
+        pattern: String,
+        /// The underlying error message.
+        message: String,
+    },
+    /// Resolving the commit ref failed.
+    #[error("failed to resolve commit '{commit}': {message}")]
+    CommitResolve {
+        /// The ref or SHA being resolved.
+        commit: String,
+        /// The underlying error message.
+        message: String,
+    },
+    /// Walking the git tree failed.
+    #[error("tree walk failed: {0}")]
+    TreeWalk(String),
+    /// Reading a blob's bytes failed.
+    #[error("failed to read blob at '{path}': {message}")]
+    BlobRead {
+        /// The repo-relative path whose blob couldn't be read.
+        path: String,
+        /// The underlying error message.
+        message: String,
+    },
+    /// Path was not valid UTF-8.
+    #[error("non-UTF-8 path in tree: {0:?}")]
+    NonUtf8Path(Vec<u8>),
+}
+
+/// A deterministic content fingerprint over named inputs.
+#[derive(Debug, Clone)] +pub struct ContentFingerprint { + /// The overall fingerprint CID. + pub cid: BritCid, + /// Individual input hashes (name -> CID of that input's bytes). + pub inputs: BTreeMap<String, BritCid>, +} + +impl ContentFingerprint { + /// Compute a fingerprint from a map of named inputs. + /// + /// Keys are sorted (BTreeMap guarantees this). Each input's bytes are + /// individually hashed, then all hashes are concatenated with their keys + /// and hashed again to produce the overall fingerprint. + #[must_use] + pub fn compute(inputs: &BTreeMap<String, Vec<u8>>) -> Self { + let mut individual: BTreeMap<String, BritCid> = BTreeMap::new(); + let mut combined = Vec::new(); + + for (name, bytes) in inputs { + let input_cid = BritCid::compute(bytes); + combined.extend_from_slice(name.as_bytes()); + combined.push(0); + combined.extend_from_slice(input_cid.as_str().as_bytes()); + combined.push(0); + individual.insert(name.clone(), input_cid); + } + + let cid = BritCid::compute(&combined); + ContentFingerprint { + cid, + inputs: individual, + } + } +} diff --git a/brit-graph/src/graph.rs b/brit-graph/src/graph.rs new file mode 100644 index 00000000000..32d2880a426 --- /dev/null +++ b/brit-graph/src/graph.rs @@ -0,0 +1,130 @@ +//! `EprGraph` — a content-addressed directed graph. +//! +//! Nodes implement `ContentNode` from brit-epr. Each node is indexed by its +//! `BritCid`. Edges represent dependencies: an edge from A to B means +//! "A depends on B" (B must complete before A). + +use std::collections::HashMap; + +use brit_epr::{BritCid, ContentNode}; +use petgraph::algo::is_cyclic_directed; +use petgraph::graph::{DiGraph, NodeIndex}; + +/// A content-addressed directed graph where nodes implement `ContentNode`. +pub struct EprGraph<N: ContentNode, E = ()> { + /// Petgraph directed graph. Node weight is the offset index into `node_data`. + inner: DiGraph<usize, E>, + /// Forward map: CID → petgraph NodeIndex. + cid_to_index: HashMap<BritCid, NodeIndex>, + /// Reverse map: petgraph NodeIndex → CID (O(1) lookup for traversal).
+ index_to_cid_map: HashMap<NodeIndex, BritCid>, + /// Parallel Vec of node payloads; addressed by the usize stored in `inner`. + node_data: Vec<N>, +} + +/// Errors from graph operations. +#[derive(Debug, thiserror::Error)] +pub enum GraphError { + /// A referenced node CID was not found in the graph. + #[error("node not found: {0}")] + NodeNotFound(BritCid), + /// Failed to compute CID for a node. + #[error("CID computation failed: {0}")] + CidError(#[from] serde_json::Error), +} + +impl<N: ContentNode, E> EprGraph<N, E> { + /// Create an empty graph. + pub fn new() -> Self { + Self { + inner: DiGraph::new(), + cid_to_index: HashMap::new(), + index_to_cid_map: HashMap::new(), + node_data: Vec::new(), + } + } + + /// Add a node. If a node with the same CID already exists, this is a no-op. + /// Returns the CID of the node. + pub fn add_node(&mut self, node: N) -> Result<BritCid, GraphError> { + let cid = node.compute_cid()?; + if self.cid_to_index.contains_key(&cid) { + return Ok(cid); + } + let data_idx = self.node_data.len(); + self.node_data.push(node); + let graph_idx = self.inner.add_node(data_idx); + self.cid_to_index.insert(cid.clone(), graph_idx); + self.index_to_cid_map.insert(graph_idx, cid.clone()); + Ok(cid) + } + + /// Add a directed edge: `from` depends on `to`. + /// No-op if the edge already exists (prevents parallel edges). + pub fn add_edge(&mut self, from: &BritCid, to: &BritCid) -> Result<(), GraphError> + where + E: Default, + { + let from_idx = self.resolve_index(from)?; + let to_idx = self.resolve_index(to)?; + if self.inner.find_edge(from_idx, to_idx).is_none() { + self.inner.add_edge(from_idx, to_idx, E::default()); + } + Ok(()) + } + + /// Get a node by CID. + pub fn get_node(&self, cid: &BritCid) -> Result<&N, GraphError> { + let graph_idx = self.resolve_index(cid)?; + let data_idx = self.inner[graph_idx]; + Ok(&self.node_data[data_idx]) + } + + /// Number of nodes in the graph.
+ #[must_use] + pub fn node_count(&self) -> usize { + self.cid_to_index.len() + } + + /// Check whether the graph contains any cycles. + #[must_use] + pub fn has_cycle(&self) -> bool { + is_cyclic_directed(&self.inner) + } + + /// Get all node CIDs. + #[must_use] + pub fn cids(&self) -> Vec<BritCid> { + self.cid_to_index.keys().cloned().collect() + } + + /// Check if a CID exists in the graph. + #[must_use] + pub fn contains(&self, cid: &BritCid) -> bool { + self.cid_to_index.contains_key(cid) + } + + /// Access the inner petgraph (for traits that need direct graph access). + pub(crate) fn inner_graph(&self) -> &DiGraph<usize, E> { + &self.inner + } + + /// Resolve a CID to a petgraph NodeIndex. + pub(crate) fn resolve_index(&self, cid: &BritCid) -> Result<NodeIndex, GraphError> { + self.cid_to_index + .get(cid) + .copied() + .ok_or_else(|| GraphError::NodeNotFound(cid.clone())) + } + + /// Resolve a petgraph NodeIndex to a CID — O(1) via reverse map. + pub(crate) fn index_to_cid(&self, idx: NodeIndex) -> Option<BritCid> { + self.index_to_cid_map.get(&idx).cloned() + } +} + +impl<N: ContentNode, E> Default for EprGraph<N, E> { + fn default() -> Self { + Self::new() + } +} diff --git a/brit-graph/src/lib.rs b/brit-graph/src/lib.rs new file mode 100644 index 00000000000..60d1ed49240 --- /dev/null +++ b/brit-graph/src/lib.rs @@ -0,0 +1,18 @@ +//! brit-graph — EPR-native graph engine. +//! +//! Provides DAG construction with BritCid-keyed nodes, affected tracking +//! with provenance, content fingerprinting, and topological planning. +//! Pure computation — no IO, no git, no network. +//! +//! Any type implementing `ContentNode` (from brit-epr) can be a graph node.
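The `add_node` path above dedupes on content identity: the CID is computed first, and an existing CID short-circuits before any graph state is touched. A minimal std-only sketch of that idempotent, content-addressed insert — `MiniGraph` and its string "cid" are illustrative stand-ins for `EprGraph` and `BritCid`, not crate API:

```rust
use std::collections::HashMap;

/// Stand-in content id: derived purely from the node's content.
/// (The real crate hashes the node's serialized bytes into a `BritCid`.)
type Cid = String;

struct MiniGraph {
    // cid -> offset into `node_data`, mirroring `cid_to_index`
    cid_to_index: HashMap<Cid, usize>,
    // parallel payload storage, mirroring `node_data`
    node_data: Vec<String>,
}

impl MiniGraph {
    fn new() -> Self {
        Self { cid_to_index: HashMap::new(), node_data: Vec::new() }
    }

    /// Insert a node keyed by its content id; re-inserting identical
    /// content is a no-op, like `EprGraph::add_node`.
    fn add_node(&mut self, content: &str) -> Cid {
        let cid: Cid = format!("cid:{content}");
        if !self.cid_to_index.contains_key(&cid) {
            self.cid_to_index.insert(cid.clone(), self.node_data.len());
            self.node_data.push(content.to_string());
        }
        cid
    }

    fn node_count(&self) -> usize {
        self.node_data.len()
    }
}

fn main() {
    let mut g = MiniGraph::new();
    let a = g.add_node("alpha");
    let a2 = g.add_node("alpha"); // same content, same cid, no duplicate
    g.add_node("beta");
    assert_eq!(a, a2);
    assert_eq!(g.node_count(), 2);
}
```

Storing payloads in a parallel `Vec` and keeping only a `usize` as the petgraph node weight is what lets the real graph keep two O(1) maps (CID to index and back) without cloning node data.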
+ +#![deny(missing_docs, rust_2018_idioms)] +#![forbid(unsafe_code)] + +pub mod graph; +pub mod traits; +pub mod affected; +pub mod fingerprint; +#[cfg(feature = "repo")] +pub mod repo_fingerprint; +pub mod topo; diff --git a/brit-graph/src/repo_fingerprint.rs b/brit-graph/src/repo_fingerprint.rs new file mode 100644 index 00000000000..3c072662c79 --- /dev/null +++ b/brit-graph/src/repo_fingerprint.rs @@ -0,0 +1,205 @@ +//! Repo-aware fingerprint constructor (feature: `repo`). +//! +//! Builds a `ContentFingerprint` from file contents resolved through glob +//! patterns against a git tree at a specific commit. Reads blobs from the +//! tree, NOT the working tree — fingerprints are reproducible across machines +//! given the same commit and patterns. + +use std::collections::{BTreeMap, VecDeque}; + +use globset::{Glob, GlobSetBuilder}; +use gix::bstr::{BStr, BString, ByteSlice, ByteVec}; + +use crate::fingerprint::{ContentFingerprint, FingerprintError}; + +impl ContentFingerprint { + /// Compute a fingerprint by reading files from a git tree at a specific + /// commit, matching against the given glob patterns. + /// + /// Files are read from the git tree (not the working tree) for + /// reproducibility. Same commit + same patterns = same fingerprint, + /// regardless of working-tree state. + /// + /// Submodule entries and symlinks in the tree are skipped (only regular + /// blobs and executable blobs are included). Empty pattern set or no + /// matching files produces a stable empty-input fingerprint. 
+ pub fn from_repo_globs( + repo: &gix::Repository, + commit_id: gix::ObjectId, + patterns: &[String], + ) -> Result<Self, FingerprintError> { + // Step A: build the GlobSet + let mut builder = GlobSetBuilder::new(); + for pattern in patterns { + let glob = Glob::new(pattern).map_err(|e| FingerprintError::InvalidGlob { + pattern: pattern.clone(), + message: e.to_string(), + })?; + builder.add(glob); + } + let globset = builder + .build() + .map_err(|e| FingerprintError::InvalidGlob { + pattern: patterns.join(", "), + message: e.to_string(), + })?; + + // Step B: open the tree at the commit + let object = repo + .find_object(commit_id) + .map_err(|e| FingerprintError::CommitResolve { + commit: commit_id.to_hex().to_string(), + message: e.to_string(), + })?; + let commit = object + .try_into_commit() + .map_err(|e| FingerprintError::CommitResolve { + commit: commit_id.to_hex().to_string(), + message: format!("not a commit: {e}"), + })?; + let tree = commit + .tree() + .map_err(|e| FingerprintError::TreeWalk(e.to_string()))?; + + // Step C: walk the tree, collect matching (path, blob_bytes) + let mut inputs: BTreeMap<String, Vec<u8>> = BTreeMap::new(); + let mut walk_errors: Vec<String> = Vec::new(); + + let mut visitor = TreeCollector { + repo, + globset: &globset, + inputs: &mut inputs, + errors: &mut walk_errors, + path: BString::default(), + path_deque: VecDeque::new(), + }; + + tree.traverse() + .breadthfirst(&mut visitor) + .map_err(|e| FingerprintError::TreeWalk(e.to_string()))?; + + if !walk_errors.is_empty() { + return Err(FingerprintError::TreeWalk(walk_errors.join("; "))); + } + + // Step D: delegate to existing pure compute + Ok(Self::compute(&inputs)) + } +} + +/// Visitor that walks a tree, matches paths against globs, and collects +/// blob contents for matching files. +/// +/// Path tracking mirrors `gix`'s own `Recorder` — a deque holds the full +/// path snapshot for each pending subtree so that breadth-first traversal +/// can restore the correct prefix when it descends into a subdirectory.
+struct TreeCollector<'a> { + repo: &'a gix::Repository, + globset: &'a globset::GlobSet, + inputs: &'a mut BTreeMap<String, Vec<u8>>, + errors: &'a mut Vec<String>, + /// The current path prefix (built up as we traverse). + path: BString, + /// Snapshot queue: one entry per pending subtree (breadth-first). + path_deque: VecDeque<BString>, +} + +impl<'a> TreeCollector<'a> { + fn push_element(&mut self, component: &BStr) { + if !self.path.is_empty() { + self.path.push(b'/'); + } + self.path.push_str(component); + } + + fn pop_element(&mut self) { + if let Some(pos) = self.path.rfind_byte(b'/') { + self.path.resize(pos, 0); + } else { + self.path.clear(); + } + } +} + +impl<'a> gix::traverse::tree::Visit for TreeCollector<'a> { + /// Restore path from the back of the deque (depth-first; not used here). + fn pop_back_tracked_path_and_set_current(&mut self) { + self.path = self.path_deque.pop_back().unwrap_or_default(); + } + + /// Restore path from the front of the deque (breadth-first descent). + fn pop_front_tracked_path_and_set_current(&mut self) { + self.path = self + .path_deque + .pop_front() + .expect("every push_back_tracked_path_component has a matching pop_front"); + } + + /// Called for tree entries that will be queued for later traversal. + /// Save the current path (with the directory component) so it can be + /// restored when we descend into this subtree. + fn push_back_tracked_path_component(&mut self, component: &BStr) { + self.push_element(component); + self.path_deque.push_back(self.path.clone()); + } + + /// Called for every entry (tree or nontree) before visit_*. + fn push_path_component(&mut self, component: &BStr) { + self.push_element(component); + } + + /// Called after every entry (tree or nontree).
+ fn pop_path_component(&mut self) { + self.pop_element(); + } + + fn visit_tree( + &mut self, + _entry: &gix::objs::tree::EntryRef<'_>, + ) -> gix::traverse::tree::visit::Action { + // Continue(true) = descend into this subtree + std::ops::ControlFlow::Continue(true) + } + + fn visit_nontree( + &mut self, + entry: &gix::objs::tree::EntryRef<'_>, + ) -> gix::traverse::tree::visit::Action { + // Skip submodules and symlinks + if !matches!( + entry.mode.kind(), + gix::objs::tree::EntryKind::Blob | gix::objs::tree::EntryKind::BlobExecutable + ) { + return std::ops::ControlFlow::Continue(true); + } + + // At this point self.path already contains the full path (push_path_component + // was called with entry.filename before visit_nontree). + let path_str = match std::str::from_utf8(&self.path) { + Ok(s) => s.to_string(), + Err(_) => { + self.errors + .push(format!("non-utf8 path: {:?}", self.path.as_bstr())); + return std::ops::ControlFlow::Continue(true); + } + }; + + // Glob match against the full path + if !self.globset.is_match(&path_str) { + return std::ops::ControlFlow::Continue(true); + } + + // Read the blob + match self.repo.find_object(entry.oid) { + Ok(obj) => { + let bytes = obj.data.clone(); + self.inputs.insert(path_str, bytes); + } + Err(e) => { + self.errors.push(format!("read {}: {e}", path_str)); + } + } + + std::ops::ControlFlow::Continue(true) + } +} diff --git a/brit-graph/src/topo.rs b/brit-graph/src/topo.rs new file mode 100644 index 00000000000..b2d492df5c3 --- /dev/null +++ b/brit-graph/src/topo.rs @@ -0,0 +1,104 @@ +//! Topological planning — sort affected nodes into parallelizable levels. +//! +//! Level 0: nodes with no unmet dependencies (leaves). +//! Level 1: nodes whose dependencies are all in level 0. +//! And so on. Nodes within a level can execute in parallel. 
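The level semantics just described can be sketched with Kahn's algorithm over plain string ids — a simplified, std-only version of what `TopoPlan::from_affected` computes over CIDs. The `levels` helper and its string keys are illustrative assumptions, not crate API; the real code also restricts the walk to an affected subset and runs over petgraph indices:

```rust
use std::collections::HashMap;

/// Group nodes into dependency levels: level 0 has no dependencies within
/// the set, level 1 depends only on level 0, and so on. Nodes in a cycle
/// never reach in-degree 0 and are silently dropped in this sketch.
fn levels(deps: &HashMap<&str, Vec<&str>>) -> Vec<Vec<String>> {
    // in-degree = number of dependencies still unmet
    let mut in_degree: HashMap<&str, usize> =
        deps.iter().map(|(n, d)| (*n, d.len())).collect();
    let mut out = Vec::new();
    let mut current: Vec<&str> = in_degree
        .iter()
        .filter(|(_, &d)| d == 0)
        .map(|(n, _)| *n)
        .collect();
    while !current.is_empty() {
        current.sort(); // deterministic output for the example
        let mut next = Vec::new();
        for done in &current {
            // any node that depended on `done` loses one unmet dependency
            for (node, d) in deps {
                if d.contains(done) {
                    let deg = in_degree.get_mut(node).unwrap();
                    *deg -= 1;
                    if *deg == 0 {
                        next.push(*node);
                    }
                }
            }
        }
        out.push(current.iter().map(|s| s.to_string()).collect());
        current = next;
    }
    out
}

fn main() {
    // a depends on b, b depends on c, c is a leaf
    let deps = HashMap::from([("a", vec!["b"]), ("b", vec!["c"]), ("c", vec![])]);
    let plan = levels(&deps);
    assert_eq!(plan, vec![vec!["c"], vec!["b"], vec!["a"]]);
}
```

Everything inside one level has no remaining edges between its members, which is exactly why callers can execute a level's nodes in parallel.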
+ +use brit_epr::{BritCid, ContentNode}; +use petgraph::Direction; +use rustc_hash::{FxHashMap, FxHashSet}; + +use crate::graph::{EprGraph, GraphError}; + +/// A topological execution plan grouped by dependency level. +#[derive(Debug, Clone)] +pub struct TopoPlan { + /// Each inner vec is a set of nodes that can execute in parallel. + /// levels[0] has no dependencies, levels[1] depends only on levels[0], etc. + pub levels: Vec<Vec<BritCid>>, +} + +impl TopoPlan { + /// Build a topological plan from a set of affected CIDs within a graph. + /// + /// Only includes nodes that appear in `affected_cids`. Dependencies between + /// affected nodes determine the level grouping. Dependencies on non-affected + /// nodes are treated as already satisfied. + pub fn from_affected<N: ContentNode, E>( + graph: &EprGraph<N, E>, + affected_cids: &[BritCid], + ) -> Result<Self, GraphError> { + if affected_cids.is_empty() { + return Ok(TopoPlan { levels: vec![] }); + } + + let affected_set: FxHashSet<BritCid> = affected_cids.iter().cloned().collect(); + + // Compute in-degree for each affected node (only counting edges to other affected nodes) + // Outgoing edges = dependencies. In-degree here counts how many affected deps this node has.
+ let mut in_degree: FxHashMap<BritCid, usize> = FxHashMap::default(); + for cid in &affected_set { + let idx = graph.resolve_index(cid)?; + let count = graph + .inner_graph() + .neighbors_directed(idx, Direction::Outgoing) + .filter(|&neighbor| { + graph + .index_to_cid(neighbor) + .is_some_and(|c| affected_set.contains(&c)) + }) + .count(); + in_degree.insert(cid.clone(), count); + } + + // Kahn's algorithm with level tracking + let mut levels = Vec::new(); + let mut current_level: Vec<BritCid> = in_degree + .iter() + .filter(|(_, &deg)| deg == 0) + .map(|(cid, _)| cid.clone()) + .collect(); + + while !current_level.is_empty() { + let mut next_level: Vec<BritCid> = Vec::new(); + + for cid in &current_level { + let idx = graph.resolve_index(cid)?; + // Find affected nodes that depend on this node (incoming edges = dependents) + for neighbor in graph.inner_graph().neighbors_directed(idx, Direction::Incoming) { + if let Some(neighbor_cid) = graph.index_to_cid(neighbor) { + if let Some(deg) = in_degree.get_mut(&neighbor_cid) { + *deg = deg.saturating_sub(1); + if *deg == 0 { + next_level.push(neighbor_cid); + } + } + } + } + } + + levels.push(current_level); + current_level = next_level; + } + + Ok(TopoPlan { levels }) + } + + /// Total number of nodes across all levels. + #[must_use] + pub fn total_nodes(&self) -> usize { + self.levels.iter().map(Vec::len).sum() + } + + /// Flatten into a single ordered vec (level 0 first, then level 1, etc). + #[must_use] + pub fn flatten(&self) -> Vec<BritCid> { + self.levels.iter().flat_map(|l| l.iter().cloned()).collect() + } + + /// Whether the plan is empty (no affected nodes). + #[must_use] + pub fn is_empty(&self) -> bool { + self.levels.is_empty() + } +} diff --git a/brit-graph/src/traits.rs b/brit-graph/src/traits.rs new file mode 100644 index 00000000000..4785e933077 --- /dev/null +++ b/brit-graph/src/traits.rs @@ -0,0 +1,86 @@ +//! Graph traversal traits — dependencies_of, dependents_of, and deep variants.
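The deep variants in this module share one breadth-first walk with a visited set, parameterized only by edge direction. A std-only sketch of that same shape over string ids and an edge list — the `reachable` helper is illustrative, not crate API, and the real code walks petgraph `NodeIndex`es in either `Direction`:

```rust
use std::collections::{HashSet, VecDeque};

/// Collect everything transitively reachable from `start` via `edges`,
/// excluding `start` itself — the same BFS + visited-set shape as
/// `traverse_deep`, so diamond-shaped graphs yield each node once.
fn reachable(edges: &[(&str, &str)], start: &str) -> Vec<String> {
    let neighbors = |n: &str| -> Vec<&str> {
        edges.iter().filter(|(f, _)| *f == n).map(|(_, t)| *t).collect()
    };
    let mut visited: HashSet<String> = HashSet::new();
    let mut result = Vec::new();
    // seed with the start node's neighbors so `start` is never reported
    let mut queue: VecDeque<&str> = neighbors(start).into();
    while let Some(n) = queue.pop_front() {
        if !visited.insert(n.to_string()) {
            continue; // already seen via another path
        }
        result.push(n.to_string());
        for next in neighbors(n) {
            if !visited.contains(next) {
                queue.push_back(next);
            }
        }
    }
    result
}

fn main() {
    // a -> b -> c, plus a direct a -> c edge (diamond-ish)
    let edges = [("a", "b"), ("b", "c"), ("a", "c")];
    let deep = reachable(&edges, "a");
    assert_eq!(deep.len(), 2); // b and c, each exactly once
    assert!(deep.contains(&"b".to_string()) && deep.contains(&"c".to_string()));
}
```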
+ +use std::collections::VecDeque; + +use brit_epr::{BritCid, ContentNode}; +use petgraph::Direction; +use rustc_hash::FxHashSet; + +use crate::graph::{EprGraph, GraphError}; + +/// Trait for querying graph relationships. +pub trait GraphConnections { + /// Direct dependencies of a node (outgoing edges). + fn dependencies_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError>; + + /// Direct dependents of a node (incoming edges). + fn dependents_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError>; + + /// All transitive dependencies (deep). + fn deep_dependencies_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError>; + + /// All transitive dependents (deep). + fn deep_dependents_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError>; +} + +impl<N: ContentNode, E> GraphConnections for EprGraph<N, E> { + fn dependencies_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError> { + let idx = self.resolve_index(cid)?; + let graph = self.inner_graph(); + Ok(graph + .neighbors_directed(idx, Direction::Outgoing) + .filter_map(|neighbor| self.index_to_cid(neighbor)) + .collect()) + } + + fn dependents_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError> { + let idx = self.resolve_index(cid)?; + let graph = self.inner_graph(); + Ok(graph + .neighbors_directed(idx, Direction::Incoming) + .filter_map(|neighbor| self.index_to_cid(neighbor)) + .collect()) + } + + fn deep_dependencies_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError> { + self.traverse_deep(cid, Direction::Outgoing) + } + + fn deep_dependents_of(&self, cid: &BritCid) -> Result<Vec<BritCid>, GraphError> { + self.traverse_deep(cid, Direction::Incoming) + } +} + +impl<N: ContentNode, E> EprGraph<N, E> { + fn traverse_deep( + &self, + start: &BritCid, + direction: Direction, + ) -> Result<Vec<BritCid>, GraphError> { + let start_idx = self.resolve_index(start)?; + let graph = self.inner_graph(); + let mut visited = FxHashSet::default(); + let mut result = Vec::new(); + let mut queue = VecDeque::new(); + + for neighbor in graph.neighbors_directed(start_idx, direction) { + queue.push_back(neighbor); + } + + while let Some(idx) = 
queue.pop_front() { + if !visited.insert(idx) { + continue; + } + if let Some(cid) = self.index_to_cid(idx) { + result.push(cid); + } + for neighbor in graph.neighbors_directed(idx, direction) { + if !visited.contains(&neighbor) { + queue.push_back(neighbor); + } + } + } + + Ok(result) + } +} diff --git a/brit-graph/tests/affected_tracking.rs b/brit-graph/tests/affected_tracking.rs new file mode 100644 index 00000000000..806e9df4ed5 --- /dev/null +++ b/brit-graph/tests/affected_tracking.rs @@ -0,0 +1,100 @@ +use brit_epr::{BritCid, ContentNode}; +use brit_graph::affected::{AffectedBy, AffectedTracker, PropagationScope}; +use brit_graph::graph::EprGraph; +use serde::{Deserialize, Serialize}; + +#[derive(Debug, Clone, Serialize, Deserialize)] +struct TestNode { + name: String, +} + +impl ContentNode for TestNode { + fn content_type(&self) -> &'static str { + "test.node" + } +} + +fn three_node_chain() -> (EprGraph<TestNode>, BritCid, BritCid, BritCid) { + let mut graph = EprGraph::new(); + let a = TestNode { name: "aff-a".into() }; // depends on b + let b = TestNode { name: "aff-b".into() }; // depends on c + let c = TestNode { name: "aff-c".into() }; // leaf + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); + graph.add_edge(&cid_b, &cid_c).unwrap(); + + (graph, cid_a, cid_b, cid_c) +} + +#[test] +fn directly_affected_node_is_tracked() { + let (graph, _, _, cid_c) = three_node_chain(); + let mut tracker = AffectedTracker::new(&graph); + tracker.mark_affected(cid_c.clone(), AffectedBy::ChangedFile("src/lib.rs".into())); + + let affected = tracker.build(); + assert!(affected.is_affected(&cid_c)); + assert!(!affected.is_affected(&BritCid::compute(b"nonexistent"))); +} + +#[test] +fn upstream_propagation_deep() { + let (graph, cid_a, cid_b, cid_c) = three_node_chain(); + // c
changed -> b is affected (depends on c) -> a is affected (depends on b) + let mut tracker = AffectedTracker::new(&graph); + tracker.set_upstream_scope(PropagationScope::Deep); + tracker.mark_affected(cid_c.clone(), AffectedBy::ChangedFile("leaf.rs".into())); + tracker.propagate().unwrap(); + + let affected = tracker.build(); + assert!(affected.is_affected(&cid_c)); + assert!(affected.is_affected(&cid_b)); + assert!(affected.is_affected(&cid_a)); +} + +#[test] +fn upstream_propagation_direct() { + let (graph, cid_a, cid_b, cid_c) = three_node_chain(); + let mut tracker = AffectedTracker::new(&graph); + tracker.set_upstream_scope(PropagationScope::Direct); + tracker.mark_affected(cid_c.clone(), AffectedBy::ChangedFile("leaf.rs".into())); + tracker.propagate().unwrap(); + + let affected = tracker.build(); + assert!(affected.is_affected(&cid_c)); + assert!(affected.is_affected(&cid_b)); // direct dependent of c + assert!(!affected.is_affected(&cid_a)); // NOT affected — only direct +} + +#[test] +fn upstream_propagation_none() { + let (graph, cid_a, cid_b, cid_c) = three_node_chain(); + let mut tracker = AffectedTracker::new(&graph); + tracker.set_upstream_scope(PropagationScope::None); + tracker.mark_affected(cid_c.clone(), AffectedBy::ChangedFile("leaf.rs".into())); + tracker.propagate().unwrap(); + + let affected = tracker.build(); + assert!(affected.is_affected(&cid_c)); + assert!(!affected.is_affected(&cid_b)); + assert!(!affected.is_affected(&cid_a)); +} + +#[test] +fn provenance_tracks_why() { + let (graph, _, cid_b, cid_c) = three_node_chain(); + let mut tracker = AffectedTracker::new(&graph); + tracker.set_upstream_scope(PropagationScope::Deep); + tracker.mark_affected(cid_c.clone(), AffectedBy::ChangedFile("leaf.rs".into())); + tracker.propagate().unwrap(); + + let affected = tracker.build(); + let reasons = affected.reasons(&cid_b).unwrap(); + assert!(reasons.iter().any(|r| matches!(r, AffectedBy::UpstreamNode(_)))); +} diff --git 
a/brit-graph/tests/fingerprint_determinism.rs b/brit-graph/tests/fingerprint_determinism.rs new file mode 100644 index 00000000000..206dda71a16 --- /dev/null +++ b/brit-graph/tests/fingerprint_determinism.rs @@ -0,0 +1,64 @@ +use brit_epr::BritCid; +use brit_graph::fingerprint::ContentFingerprint; +use std::collections::BTreeMap; + +#[test] +fn same_inputs_same_fingerprint() { + let mut inputs = BTreeMap::new(); + inputs.insert("file_a".to_string(), b"content_a".to_vec()); + inputs.insert("file_b".to_string(), b"content_b".to_vec()); + + let fp1 = ContentFingerprint::compute(&inputs); + let fp2 = ContentFingerprint::compute(&inputs); + assert_eq!(fp1.cid, fp2.cid); +} + +#[test] +fn different_inputs_different_fingerprint() { + let mut inputs1 = BTreeMap::new(); + inputs1.insert("file".to_string(), b"v1".to_vec()); + + let mut inputs2 = BTreeMap::new(); + inputs2.insert("file".to_string(), b"v2".to_vec()); + + let fp1 = ContentFingerprint::compute(&inputs1); + let fp2 = ContentFingerprint::compute(&inputs2); + assert_ne!(fp1.cid, fp2.cid); +} + +#[test] +fn insertion_order_does_not_matter() { + let mut inputs1 = BTreeMap::new(); + inputs1.insert("z_file".to_string(), b"z_content".to_vec()); + inputs1.insert("a_file".to_string(), b"a_content".to_vec()); + + let mut inputs2 = BTreeMap::new(); + inputs2.insert("a_file".to_string(), b"a_content".to_vec()); + inputs2.insert("z_file".to_string(), b"z_content".to_vec()); + + let fp1 = ContentFingerprint::compute(&inputs1); + let fp2 = ContentFingerprint::compute(&inputs2); + assert_eq!(fp1.cid, fp2.cid); +} + +#[test] +fn empty_inputs_produce_valid_fingerprint() { + let inputs = BTreeMap::new(); + let fp = ContentFingerprint::compute(&inputs); + assert_eq!(fp.cid.as_str().len(), 64); +} + +#[test] +fn individual_input_cids_are_populated() { + let mut inputs = BTreeMap::new(); + inputs.insert("alpha".to_string(), b"a".to_vec()); + inputs.insert("beta".to_string(), b"b".to_vec()); + + let fp = 
ContentFingerprint::compute(&inputs); + assert_eq!(fp.inputs.len(), 2); + assert!(fp.inputs.contains_key("alpha")); + assert!(fp.inputs.contains_key("beta")); + // Each individual CID should be the BritCid::compute of the bytes + assert_eq!(fp.inputs["alpha"], BritCid::compute(b"a")); + assert_eq!(fp.inputs["beta"], BritCid::compute(b"b")); +} diff --git a/brit-graph/tests/graph_construction.rs b/brit-graph/tests/graph_construction.rs new file mode 100644 index 00000000000..9a8e4c7ad9e --- /dev/null +++ b/brit-graph/tests/graph_construction.rs @@ -0,0 +1,154 @@ +use brit_epr::{BritCid, ContentNode}; +use brit_graph::graph::EprGraph; +use brit_graph::traits::GraphConnections; +use serde::{Deserialize, Serialize}; + +/// A minimal ContentNode for testing. +#[derive(Debug, Clone, Serialize, Deserialize)] +struct TestNode { + name: String, +} + +impl ContentNode for TestNode { + fn content_type(&self) -> &'static str { + "test.node" + } +} + +#[test] +fn add_and_retrieve_node() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let node = TestNode { name: "alpha".into() }; + let cid = node.compute_cid().unwrap(); + graph.add_node(node.clone()).unwrap(); + + let retrieved = graph.get_node(&cid).unwrap(); + assert_eq!(retrieved.name, "alpha"); +} + +#[test] +fn add_edge_between_nodes() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "a".into() }; + let b = TestNode { name: "b".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); // a depends on b + + assert_eq!(graph.node_count(), 2); +} + +#[test] +fn duplicate_node_is_idempotent() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let node = TestNode { name: "dup".into() }; + graph.add_node(node.clone()).unwrap(); + graph.add_node(node.clone()).unwrap(); + assert_eq!(graph.node_count(), 1); +} + +#[test] +fn edge_to_missing_node_fails() { + let mut graph: 
EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "a".into() }; + let cid_a = a.compute_cid().unwrap(); + let missing = BritCid::compute(b"does-not-exist"); + + graph.add_node(a).unwrap(); + let result = graph.add_edge(&cid_a, &missing); + assert!(result.is_err()); +} + +#[test] +fn has_cycle_detects_cycle() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "cycle-a".into() }; + let b = TestNode { name: "cycle-b".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); + graph.add_edge(&cid_b, &cid_a).unwrap(); + + assert!(graph.has_cycle()); +} + +#[test] +fn no_cycle_in_valid_dag() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "dag-a".into() }; + let b = TestNode { name: "dag-b".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); // a -> b (a depends on b) + + assert!(!graph.has_cycle()); +} + +#[test] +fn dependencies_of_returns_direct_deps() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "tr-a".into() }; + let b = TestNode { name: "tr-b".into() }; + let c = TestNode { name: "tr-c".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); // a depends on b + graph.add_edge(&cid_a, &cid_c).unwrap(); // a depends on c + + let deps = graph.dependencies_of(&cid_a).unwrap(); + assert_eq!(deps.len(), 2); + assert!(deps.contains(&cid_b)); + assert!(deps.contains(&cid_c)); +} + +#[test] +fn dependents_of_returns_direct_dependents() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: 
"dep-a".into() }; + let b = TestNode { name: "dep-b".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); // a depends on b + + let dependents = graph.dependents_of(&cid_b).unwrap(); + assert_eq!(dependents, vec![cid_a]); +} + +#[test] +fn deep_dependencies_of_returns_transitive() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "deep-a".into() }; + let b = TestNode { name: "deep-b".into() }; + let c = TestNode { name: "deep-c".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); // a -> b + graph.add_edge(&cid_b, &cid_c).unwrap(); // b -> c + + let deep = graph.deep_dependencies_of(&cid_a).unwrap(); + assert_eq!(deep.len(), 2); + assert!(deep.contains(&cid_b)); + assert!(deep.contains(&cid_c)); +} diff --git a/brit-graph/tests/repo_fingerprint.rs b/brit-graph/tests/repo_fingerprint.rs new file mode 100644 index 00000000000..dc7048c9353 --- /dev/null +++ b/brit-graph/tests/repo_fingerprint.rs @@ -0,0 +1,134 @@ +//! Integration tests for ContentFingerprint::from_repo_globs. +//! These run only when the `repo` feature is enabled. + +#![cfg(feature = "repo")] + +use std::collections::BTreeMap; +use std::process::Command; + +use brit_graph::fingerprint::ContentFingerprint; +use tempfile::TempDir; + +/// Initialize a temp git repo with a few files committed. +/// Returns (TempDir keep-alive, repo path, head ObjectId).
+fn init_repo_with_files(files: &[(&str, &str)]) -> (TempDir, std::path::PathBuf, gix::ObjectId) { + let dir = TempDir::new().expect("temp"); + let path = dir.path().to_path_buf(); + Command::new("git").args(["init", "-q"]).current_dir(&path).status().expect("init"); + Command::new("git").args(["config", "user.email", "t@t.t"]).current_dir(&path).status().expect("config email"); + Command::new("git").args(["config", "user.name", "t"]).current_dir(&path).status().expect("config name"); + + for (rel, contents) in files { + let abs = path.join(rel); + if let Some(parent) = abs.parent() { + std::fs::create_dir_all(parent).expect("mkdir"); + } + std::fs::write(&abs, contents).expect("write"); + Command::new("git").args(["add", rel]).current_dir(&path).status().expect("add"); + } + Command::new("git") + .args(["commit", "-q", "-m", "init"]) + .current_dir(&path) + .status() + .expect("commit"); + + let repo = gix::open(&path).expect("open"); + let head_id = repo.head_id().expect("head_id").detach(); + + (dir, path, head_id) +} + +#[test] +fn empty_patterns_produces_empty_inputs_fingerprint() { + let (_keep, path, head) = init_repo_with_files(&[("a.txt", "hello\n")]); + let repo = gix::open(&path).expect("open"); + let fp = ContentFingerprint::from_repo_globs(&repo, head, &[]).expect("compute"); + assert!(fp.inputs.is_empty(), "no patterns -> no inputs"); + + // Same as compute(empty) + let baseline = ContentFingerprint::compute(&BTreeMap::new()); + assert_eq!(fp.cid, baseline.cid); +} + +#[test] +fn single_pattern_matches_one_file() { + let (_keep, path, head) = init_repo_with_files(&[ + ("src/foo.ts", "console.log('foo');\n"), + ("src/bar.rs", "fn bar() {}\n"), + ("README.md", "# project\n"), + ]); + let repo = gix::open(&path).expect("open"); + + let patterns = vec!["src/**/*.ts".to_string()]; + let fp = ContentFingerprint::from_repo_globs(&repo, head, &patterns).expect("compute"); + + // Only foo.ts should be in the inputs + assert_eq!(fp.inputs.len(), 1, "one .ts file"); 
assert!(fp.inputs.contains_key("src/foo.ts"), "found keys: {:?}", fp.inputs.keys().collect::<Vec<_>>()); +} + +#[test] +fn deterministic_across_calls_same_inputs() { + let (_keep, path, head) = init_repo_with_files(&[ + ("src/a.ts", "a\n"), + ("src/b.ts", "b\n"), + ]); + let repo = gix::open(&path).expect("open"); + + let patterns = vec!["src/**/*.ts".to_string()]; + let fp1 = ContentFingerprint::from_repo_globs(&repo, head, &patterns).expect("1"); + let fp2 = ContentFingerprint::from_repo_globs(&repo, head, &patterns).expect("2"); + + assert_eq!(fp1.cid, fp2.cid, "deterministic"); + assert_eq!(fp1.inputs.len(), fp2.inputs.len()); +} + +#[test] +fn different_content_different_fingerprint() { + // Same patterns, same paths, different file CONTENT -> different fingerprint. + // This is the property that the OLD pattern-bytes hashing did NOT have. + let (_keep_a, path_a, head_a) = init_repo_with_files(&[("src/foo.ts", "version 1\n")]); + let (_keep_b, path_b, head_b) = init_repo_with_files(&[("src/foo.ts", "version 2\n")]); + + let repo_a = gix::open(&path_a).expect("a"); + let repo_b = gix::open(&path_b).expect("b"); + + let patterns = vec!["src/**/*.ts".to_string()]; + let fp_a = ContentFingerprint::from_repo_globs(&repo_a, head_a, &patterns).expect("a"); + let fp_b = ContentFingerprint::from_repo_globs(&repo_b, head_b, &patterns).expect("b"); + + assert_ne!(fp_a.cid, fp_b.cid, "different content must produce different fingerprint"); +} + +#[test] +fn no_matching_files_is_empty_fingerprint() { + let (_keep, path, head) = init_repo_with_files(&[("README.md", "x")]); + let repo = gix::open(&path).expect("open"); + let patterns = vec!["src/**/*.ts".to_string()]; + let fp = ContentFingerprint::from_repo_globs(&repo, head, &patterns).expect("compute"); + assert!(fp.inputs.is_empty()); +} + +#[test] +fn multiple_patterns_combine() { + let (_keep, path, head) = init_repo_with_files(&[ + ("src/foo.ts", "ts\n"), + ("src/bar.rs", "rs\n"), + ("README.md", "md\n"), + ]); + let repo = 
gix::open(&path).expect("open"); + let patterns = vec!["src/**/*.ts".to_string(), "src/**/*.rs".to_string()]; + let fp = ContentFingerprint::from_repo_globs(&repo, head, &patterns).expect("compute"); + assert_eq!(fp.inputs.len(), 2); + assert!(fp.inputs.contains_key("src/foo.ts")); + assert!(fp.inputs.contains_key("src/bar.rs")); +} + +#[test] +fn invalid_glob_returns_error() { + let (_keep, path, head) = init_repo_with_files(&[("a.txt", "x")]); + let repo = gix::open(&path).expect("open"); + let patterns = vec!["[invalid".to_string()]; + let err = ContentFingerprint::from_repo_globs(&repo, head, &patterns).unwrap_err(); + assert!(matches!(err, brit_graph::fingerprint::FingerprintError::InvalidGlob { .. })); +} diff --git a/brit-graph/tests/topo_ordering.rs b/brit-graph/tests/topo_ordering.rs new file mode 100644 index 00000000000..b4e98342d69 --- /dev/null +++ b/brit-graph/tests/topo_ordering.rs @@ -0,0 +1,103 @@ +use brit_epr::{BritCid, ContentNode}; +use brit_graph::graph::EprGraph; +use brit_graph::topo::TopoPlan; +use serde::{Deserialize, Serialize}; + +#[derive(Debug, Clone, Serialize, Deserialize)] +struct TestNode { + name: String, +} + +impl ContentNode for TestNode { + fn content_type(&self) -> &'static str { + "test.node" + } +} + +#[test] +fn topo_plan_groups_by_level() { + // c has no deps (level 0) + // b depends on c (level 1) + // a depends on b (level 2) + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "topo-a".into() }; + let b = TestNode { name: "topo-b".into() }; + let c = TestNode { name: "topo-c".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); + graph.add_edge(&cid_b, &cid_c).unwrap(); + + let affected = vec![cid_a.clone(), cid_b.clone(), cid_c.clone()]; + let plan = TopoPlan::from_affected(&graph, 
&affected).unwrap(); + + assert_eq!(plan.levels.len(), 3); + assert!(plan.levels[0].contains(&cid_c)); // leaf first + assert!(plan.levels[1].contains(&cid_b)); + assert!(plan.levels[2].contains(&cid_a)); +} + +#[test] +fn topo_plan_parallel_at_same_level() { + // b and c have no deps (level 0, parallelizable) + // a depends on both b and c (level 1) + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "par-a".into() }; + let b = TestNode { name: "par-b".into() }; + let c = TestNode { name: "par-c".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); + graph.add_edge(&cid_a, &cid_c).unwrap(); + + let affected = vec![cid_a.clone(), cid_b.clone(), cid_c.clone()]; + let plan = TopoPlan::from_affected(&graph, &affected).unwrap(); + + assert_eq!(plan.levels.len(), 2); + assert_eq!(plan.levels[0].len(), 2); // b and c at level 0 + assert!(plan.levels[0].contains(&cid_b)); + assert!(plan.levels[0].contains(&cid_c)); + assert_eq!(plan.levels[1], vec![cid_a]); // a at level 1 +} + +#[test] +fn topo_plan_skips_unaffected() { + let mut graph: EprGraph<TestNode> = EprGraph::new(); + let a = TestNode { name: "skip-a".into() }; + let b = TestNode { name: "skip-b".into() }; + let c = TestNode { name: "skip-c".into() }; + let cid_a = a.compute_cid().unwrap(); + let cid_b = b.compute_cid().unwrap(); + let cid_c = c.compute_cid().unwrap(); + + graph.add_node(a).unwrap(); + graph.add_node(b).unwrap(); + graph.add_node(c).unwrap(); + graph.add_edge(&cid_a, &cid_b).unwrap(); + + // Only b is affected, not c (c is independent) + let affected = vec![cid_b.clone()]; + let plan = TopoPlan::from_affected(&graph, &affected).unwrap(); + + let all_cids: Vec<&BritCid> = plan.levels.iter().flat_map(|l: &Vec<BritCid>| l.iter()).collect(); + assert!(all_cids.contains(&&cid_b)); + 
assert!(!all_cids.contains(&&cid_c)); +} + +#[test] +fn topo_plan_empty_affected_produces_empty_plan() { + let graph: EprGraph<TestNode> = EprGraph::new(); + let affected: Vec<BritCid> = vec![]; + let plan = TopoPlan::from_affected(&graph, &affected).unwrap(); + assert!(plan.levels.is_empty()); +} diff --git a/brit-verify/Cargo.toml b/brit-verify/Cargo.toml new file mode 100644 index 00000000000..3dde8c6088d --- /dev/null +++ b/brit-verify/Cargo.toml @@ -0,0 +1,19 @@ +lints.workspace = true + +[package] +name = "brit-verify" +version = "0.0.0" +description = "Verify pillar trailers on a git commit — the first brit binary" +repository = "https://github.com/ethosengine/brit" +authors = ["Matthew Dowell "] +license = "MIT OR Apache-2.0" +edition = "2021" +rust-version = "1.82" + +[[bin]] +name = "brit-verify" +path = "src/main.rs" + +[dependencies] +brit-epr = { version = "^0.0.0", path = "../brit-epr" } +gix = { version = "^0.81.0", path = "../gix", default-features = false, features = ["revision"] } diff --git a/brit-verify/src/main.rs b/brit-verify/src/main.rs new file mode 100644 index 00000000000..279a5bbc3da --- /dev/null +++ b/brit-verify/src/main.rs @@ -0,0 +1,115 @@ +//! `brit-verify` — verify pillar trailers on a git commit. +//! +//! Usage: `brit-verify <rev> [--repo <path>]` +//! +//! Opens the repository at `<path>` (current directory if omitted), resolves +//! `<rev>` to a commit object, extracts the commit message body, +//! parses pillar trailers with brit-epr, runs structural validation, and +//! prints the result. Exits 0 on success, 1 on validation failure, 2 on +//! usage error, 3 on repo error. +//! +//! No clap, no tracing — smallest possible end-to-end proof that parser +//! and validator work against real git objects. 
+ +use std::process::ExitCode; + +use brit_epr::{parse_pillar_trailers, validate_pillar_trailers}; + +fn main() -> ExitCode { + let args: Vec<String> = std::env::args().collect(); + + let (rev, repo_path) = match parse_args(&args) { + Ok(parsed) => parsed, + Err(msg) => { + eprintln!("{msg}\n\nUsage: brit-verify <rev> [--repo <path>]"); + return ExitCode::from(2); + } + }; + + let repo = match gix::discover(&repo_path) { + Ok(repo) => repo, + Err(e) => { + eprintln!("failed to open repo at {repo_path}: {e}"); + return ExitCode::from(3); + } + }; + + let commit = match repo.rev_parse_single(rev.as_str()) { + Ok(id) => match id.object() { + Ok(obj) => match obj.try_into_commit() { + Ok(c) => c, + Err(_) => { + eprintln!("rev {rev} does not point at a commit"); + return ExitCode::from(3); + } + }, + Err(e) => { + eprintln!("failed to load object for {rev}: {e}"); + return ExitCode::from(3); + } + }, + Err(e) => { + eprintln!("failed to resolve rev {rev}: {e}"); + return ExitCode::from(3); + } + }; + + let decoded = match commit.decode() { + Ok(c) => c, + Err(e) => { + eprintln!("failed to decode commit {rev}: {e}"); + return ExitCode::from(3); + } + }; + + // decoded.message is &BStr (the full message including trailing trailers). + // parse_pillar_trailers takes &[u8]; BStr derefs to [u8]. 
+ let trailers = parse_pillar_trailers(decoded.message.as_ref()); + + match validate_pillar_trailers(&trailers) { + Ok(()) => { + println!("✓ pillar trailers valid for {rev}"); + println!(" Lamad: {}", trailers.lamad.as_deref().unwrap_or("-")); + println!(" Shefa: {}", trailers.shefa.as_deref().unwrap_or("-")); + println!(" Qahal: {}", trailers.qahal.as_deref().unwrap_or("-")); + if let Some(ref c) = trailers.lamad_node { + println!(" Lamad-Node: {c}"); + } + if let Some(ref c) = trailers.shefa_node { + println!(" Shefa-Node: {c}"); + } + if let Some(ref c) = trailers.qahal_node { + println!(" Qahal-Node: {c}"); + } + ExitCode::SUCCESS + } + Err(e) => { + eprintln!("✗ pillar validation failed for {rev}: {e}"); + ExitCode::FAILURE + } + } +} + +fn parse_args(args: &[String]) -> Result<(String, String), String> { + if args.len() < 2 { + return Err("missing argument".into()); + } + let rev = args[1].clone(); + let mut repo_path = ".".to_string(); + + let mut i = 2; + while i < args.len() { + match args[i].as_str() { + "--repo" => { + i += 1; + if i >= args.len() { + return Err("--repo requires a path argument".into()); + } + repo_path = args[i].clone(); + i += 1; + } + unknown => return Err(format!("unknown argument: {unknown}")), + } + } + Ok((rev, repo_path)) +} diff --git a/devfile.yaml b/devfile.yaml new file mode 100644 index 00000000000..eadeba96fc9 --- /dev/null +++ b/devfile.yaml @@ -0,0 +1,137 @@ +schemaVersion: 2.2.0 +metadata: + name: brit-devspace + displayName: Brit Development Environment + description: >- + Pure-Rust development environment for brit — the gitoxide fork that adds + Elohim Protocol covenant primitives (Lamad/Shefa/Qahal trailers) to git. 
+ tags: + - Rust + - gitoxide + - brit + - Claude + projectType: rust + language: Rust + version: 1.0.0 + +projects: + - name: brit + git: + remotes: + origin: https://github.com/ethosengine/brit.git + +components: + - name: tools + container: + image: harbor.ethosengine.com/devspaces/rust-nix-dev:latest + memoryLimit: 10Gi + memoryRequest: 2Gi + cpuLimit: '2' + cpuRequest: '500m' + mountSources: true + sourceMapping: /projects + env: + - name: CLAUDE_CONFIG_DIR + value: /projects/.claude-config + - name: USER + value: user + - name: RUST_BACKTRACE + value: '1' + # XDG directories — subdirs under /projects-cache volume + - name: XDG_CACHE_HOME + value: /projects-cache/xdg/cache + - name: XDG_DATA_HOME + value: /projects-cache/xdg/data + - name: XDG_STATE_HOME + value: /projects-cache/xdg/state + - name: XDG_CONFIG_HOME + value: /projects-cache/xdg/config + # Jenkins integration + - name: JENKINS_URL + value: "https://jenkins.ethosengine.com" + # SonarQube MCP (SONARQUBE_TOKEN comes from K8s secret) + - name: STORAGE_PATH + value: /projects/.sonarqube-mcp + - name: SONARQUBE_URL + value: "https://sonarqube.ethosengine.com" + # Rust toolchain (installed to /opt/rust in the base image) + - name: RUSTUP_HOME + value: /opt/rust/rustup + - name: CARGO_HOME + value: /opt/rust/cargo + # Native build: no RUSTFLAGS override — brit is plain cargo, no WASM. + - name: CC + value: gcc + - name: PATH + value: '/opt/rust/cargo/bin:/home/user/bin:/home/user/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin' + volumeMounts: + - name: projects-cache + path: /projects-cache + + - name: projects-cache + volume: + size: 20Gi + +commands: + - id: setup-vscode-cli + exec: + component: tools + commandLine: | + # VS Code CLI symlink (code-oss -> code). Use ubi9 binary (forward-compatible with ubi10). 
+ mkdir -p /home/user/.local/bin + if [ -f /checode/checode-linux-libc/ubi9/bin/remote-cli/code-oss ]; then + ln -sf /checode/checode-linux-libc/ubi9/bin/remote-cli/code-oss /home/user/.local/bin/code + fi + + # Link ~/.claude to persistent config dir for the Claude VS Code extension. + rm -rf /home/user/.claude 2>/dev/null || true + ln -sf "$CLAUDE_CONFIG_DIR" /home/user/.claude + mkdir -p /home/user/.claude/ide + + - id: setup-claude-mcp + exec: + component: tools + workingDir: /projects/brit + commandLine: | + mkdir -p "$STORAGE_PATH" + + # SonarQube MCP + claude mcp remove sonarqube 2>/dev/null || true + claude mcp add sonarqube \ + --env STORAGE_PATH="$STORAGE_PATH" \ + --env SONARQUBE_TOKEN="$SONARQUBE_TOKEN" \ + --env SONARQUBE_URL="$SONARQUBE_URL" \ + -- java -jar /opt/mcp/sonarqube-mcp.jar + + # Jenkins MCP + claude mcp remove jenkins 2>/dev/null || true + if [ -n "$JENKINS_USERNAME" ] && [ -n "$JENKINS_TOKEN" ]; then + JENKINS_AUTH=$(echo -n "$JENKINS_USERNAME:$JENKINS_TOKEN" | base64) + claude mcp add jenkins "$JENKINS_URL/mcp-server/mcp" \ + --transport http \ + --header "Authorization: Basic $JENKINS_AUTH" + fi + + # GitHub CLI auth + if [ -n "$GH_TOKEN" ]; then + echo "$GH_TOKEN" | gh auth login --with-token + gh auth status + fi + + - id: build + exec: + component: tools + workingDir: /projects/brit + commandLine: cargo build -p brit-verify -p brit-cli -p brit-epr + + - id: test + exec: + component: tools + workingDir: /projects/brit + commandLine: cargo test --workspace + +# Disabled until the workspace boots cleanly once — re-enable after first success. +# events: +# postStart: +# - setup-vscode-cli +# - setup-claude-mcp diff --git a/docs/composition.md b/docs/composition.md new file mode 100644 index 00000000000..9b9d20d95c5 --- /dev/null +++ b/docs/composition.md @@ -0,0 +1,147 @@ +# Composition Model + +How brit composes with the protocol schemas, rust-ipfs, rakia, and elohim-storage. 
+ +## The Four Sources + +Brit sits at the intersection of four systems. It consumes protocol schemas and rust-ipfs from below, and provides git primitives upward to rakia and the broader protocol. + +``` + ┌─────────────────────────────────┐ + │ Protocol Schemas │ + │ elohim/sdk/schemas/v1/ │ + │ │ + │ ContentNode types │ + │ Reach enum (8 levels) │ + │ Pillar vocabulary │ + │ Attestation format │ + │ Trailer key grammar │ + └──────────┬──────────────────────┘ + │ defines vocabulary + v + ┌─────────────────────────────────────────────────────┐ + │ Brit │ + │ (covenant on git) │ + │ │ + │ Engine (brit-epr): trailer parse, validate, │ + │ schema dispatch, CID utilities, signing hooks │ + │ │ + │ App schema (elohim-protocol): pillar trailers, │ + │ ContentNode catalog, signal taxonomy, │ + │ reach-per-ref, merge consent │ + └──────┬──────────────────────┬──────────────────────┘ + │ │ + │ provides git │ provides CID storage + │ primitives │ and transport + v v + ┌──────────────────┐ ┌──────────────────────────┐ + │ Rakia │ │ rust-ipfs │ + │ (firmament) │ │ (storage/transport │ + │ │ │ substrate) │ + │ Change detection │ │ │ + │ Baseline refs │ │ CIDs, multihash │ + │ Manifest CIDs │ │ Bitswap, libp2p │ + │ Attestation │ │ DAG-CBOR serialization │ + └──────────────────┘ └──────────────────────────┘ + ^ ^ + │ │ + ┌──────────┴──────────────────────┴──────────────────┐ + │ elohim-storage / steward infrastructure │ + │ │ + │ libp2p swarm + discovery │ + │ ContentNode storage API │ + │ DHT publication │ + │ Doorway gateway (web2 bridge) │ + └─────────────────────────────────────────────────────┘ +``` + +## Rules of Composition + +### 1. The engine knows nothing about the protocol + +brit-epr's engine layer parses RFC-822 trailers, validates structure, dispatches semantic checks to a loaded `AppSchema`, and provides CID utilities. It does not know the words lamad, shefa, or qahal. The Elohim Protocol vocabulary lives behind `#[cfg(feature = "elohim-protocol")]`. 
Someone could disable the feature, implement `AcmeSchema: AppSchema` for carbon accounting, and use the same engine without touching brit's source. + +### 2. Brit doesn't own governance + +The merge consent critique (2026-04-11) established: brit reads consent requirements from the parent EPR's governance primitives. It does not implement governance logic. A `MergeProposalContentNode` carries a TTL and frozen requirements; the actual consent accumulation happens in the governance gateway (elohim-storage). Brit publishes the proposal, waits for the governance surface to respond, and acts on the result. + +This principle extends to every reach-change operation. Brit doesn't decide "can this ref move to reach=public?" — the protocol's qahal layer decides. Brit asks and executes. + +### 3. Brit talks to the network through rust-ipfs + +Content-addressed storage and retrieval use rust-ipfs's blockstore and Bitswap. Brit doesn't implement its own block storage or P2P transfer. The CID of a ContentNode is computed by rust-ipfs's DAG-CBOR serializer. Brit wraps the semantic layer (commits, refs, branches) over rust-ipfs's storage layer (CIDs, blocks, Bitswap). + +### 4. The doorway is the web2 bridge + +Stock git hosting (GitHub, GitLab) carries commits with trailers — the protocol surface. The linked ContentNodes, attestation graphs, reach governance, and per-branch READMEs resolve through the doorway. `.brit/doorway.toml` points the repo at its primary steward's gateway. Without a doorway, a brit repo degrades gracefully to "git with extra trailer discipline." + +### 5. Schema changes flow from the protocol + +``` +Protocol Schema changes + -> regenerate brit's Rust types (brit-epr-elohim) + -> regenerate rakia's Rust types (rakia-core) + -> regenerate TypeScript types (storage-client) +``` + +Brit vendors the protocol schemas at `schemas/elohim-protocol/v1/`. The authoritative copy lives in `elohim/sdk/schemas/v1/`. Updating is an explicit act. 
+ +## Submodule Topology + +``` +brit/ (ethosengine/brit — fork of gitoxide) + schemas/ + elohim-protocol/v1/ (vendored protocol schemas) + elohim/ + rust-ipfs/ (submodule -> ethosengine/rust-ipfs, future) +``` + +In the monorepo: +``` +elohim/ + brit/ (submodule -> ethosengine/brit) + rakia/ (submodule -> ethosengine/rakia) + elohim/brit/ (rakia's own submodule ref to brit) + rust-ipfs/ (submodule -> ethosengine/rust-ipfs) + sdk/schemas/v1/ (authoritative protocol schemas) +``` + +## How Brit Serves Rakia + +Rakia never talks to git directly — it talks to brit. This means rakia automatically benefits from brit's protocol enrichment without reimplementing any of it. + +| What rakia needs | What brit provides | Brit phase | +|---|---|---| +| Changed file paths since baseline | gix diff (object store, no shell-out) | Phase 0+1 (current) | +| Baseline ref management | notes-ref API (`refs/notes/rakia/baselines`) | Phase 0+1 | +| Attestation format for build outputs | Pillar trailers + `Built-By:` reserved key | Phase 0+1 | +| Build manifest as ContentNode | `BuildManifestContentNode` via ContentNode adapter | Phase 2 | +| Build attestation as ContentNode | `BuildAttestationContentNode` via ContentNode adapter | Phase 2 | +| Manifest distribution over P2P | libp2p fetch protocol (`/brit/fetch/1.0.0`) | Phase 3 | +| Build status per branch | Per-branch READMEs resolving to ContentNodes | Phase 4 | +| Manifest discovery via DHT | DHT announcement of repo + manifest CIDs | Phase 5 | +| Forked build recipes | ForkContentNode with independent stewardship | Phase 6 | + +## How Brit Serves the Protocol + +Beyond rakia, brit provides foundational infrastructure for the entire protocol: + +| Protocol need | What brit provides | +|---|---| +| Provenance-aware code | Every commit carries pillar trailers: who built it, what value it creates, who governs it | +| Content-addressed repositories | Repo, commit, tree, blob all addressed by CID | +| Governance-aware merging | 
MergeProposalContentNode with async consent from qahal layer | +| Fork legitimacy | ForkContentNode — a new covenant with its own stewardship, not a second-class copy | +| Reach-governed visibility | Branches carry reach levels; merging IS reach elevation | +| Schema extensibility | AppSchema trait allows domain-specific vocabularies without forking brit | +| Stock git compatibility | Every brit repo is a valid git repo; `git clone` from any forge works | + +## The Reach Bridge (Shared with Rakia) + +Reach is the concept that bridges brit and rakia most directly: + +**In brit:** Reach is per-ref. A branch at `reach=trusted` means its content is visible to trusted peers. Merging to main is reach-elevation from `trusted` to `public`. + +**In rakia:** Reach is per-artifact. A build at `reach=self` is a local build. At `reach=trusted` it's CI-verified. At `reach=community` it's staging-verified. At `reach=public` it's production-deployed. + +**The bridge:** Both are reach-elevation proposals. Both require attestation accumulation. Both flow through the protocol's governance surface. The governance system doesn't distinguish "code review approval" from "build verification" — both are witnessed claims accumulating toward a reach threshold. This is why brit and rakia share an attestation format. diff --git a/docs/plans/2026-04-11-phase-0-epr-trailer-foundation.md b/docs/plans/2026-04-11-phase-0-epr-trailer-foundation.md new file mode 100644 index 00000000000..e290fdac938 --- /dev/null +++ b/docs/plans/2026-04-11-phase-0-epr-trailer-foundation.md @@ -0,0 +1,1479 @@ +# Phase 0+1: EPR Trailer Foundation Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. 
+ +**Goal:** Scaffold the `brit-epr` crate with an engine/app-schema split and ship the first working EPR primitive — a pillar-trailer parser/validator — such that any git commit carrying `Lamad:`, `Shefa:`, `Qahal:` trailers can be verified by a `brit-verify` binary. + +**Architecture:** Single new crate `brit-epr` lives in the gitoxide workspace. It has two internal modules: an **unconditional `engine` module** (trailer parser, generic validator, `AppSchema` dispatch trait, CID types) and a **feature-gated `elohim` module** (the concrete `ElohimProtocolSchema` implementation, pillar trailer types, signal catalog constants). Default features include `elohim-protocol`. Zero modifications to existing `gix-*` crates — pure additive scaffolding. The binary `brit-verify` demonstrates end-to-end use: given a commit SHA in a local repo, parse its trailers, extract pillar fields, and exit 0 if all three pillars are present and well-formed. + +**Schema source of truth:** `docs/schemas/elohim-protocol-manifest.md` is the normative reference for trailer shapes, validation rules, and the `AppSchema` trait signature. This plan links to it rather than duplicating spec text. When this plan and the schema doc disagree, the schema doc wins and this plan gets a follow-up edit. + +**Tech stack:** Rust 2021, gitoxide's existing `gix-object` crate (for `BodyRef::trailers()` which parses RFC-822-style trailers), `thiserror` for error types. No `winnow`, `clap`, `serde_json`, or `cid` crate dependencies yet — those come in Phase 2+. + +**Upstream compatibility:** Every commit a brit user creates round-trips through stock git. Pillar trailers follow RFC-822 "Key: value" syntax, indistinguishable from `Signed-off-by:` to any reader that doesn't know about them. + +## Scope — what's IN Phase 1 + +- `brit-epr` crate in the workspace with the engine-vs-schema boundary established from Task 0. 
+- The unconditional `engine` module with: + - `AppSchema` trait (the dispatch contract from schema doc §2.3). + - `TrailerSet` — ordered, duplicate-aware map of `(key, value)` pairs preserving roundtrip order. + - `TrailerBlock` parser — given a commit body, locate and extract the trailer block via `gix_object::commit::message::BodyRef::trailers()`. + - `ValidationError` and engine error types. +- The feature-gated `elohim` module (behind `#[cfg(feature = "elohim-protocol")]`, default on) with: + - `ElohimProtocolSchema` — the `AppSchema` implementor. + - `PillarTrailers` — strongly-typed view over the six trailer keys (three canonical summaries, three linked-node CID slots). + - `parse_pillar_trailers(body)` convenience function. + - `validate_pillar_trailers(&PillarTrailers)` convenience function. +- `brit-verify` binary that opens a repo, resolves a commit rev, runs the elohim schema's parser + validator, and exits 0/1. +- Fixtures: happy-path commit body, missing-pillar body, malformed-node-ref body. +- Submodule pointer bump in the parent `elohim` monorepo. + +## Scope — what's deliberately OUT of Phase 1 + +These are excluded not because they're unimportant but because they live in later phases and including them now would bloat the first commit cycle: + +- **`MergeProposalContentNode`** and the async merge consent flow. Out. That's Phase 2+ and requires the parent-EPR governance adapter. See schema doc §5.13 and §14.1 #4. +- **Reach awareness** in `BranchContentNode`. Out. Phase 1 parses commit trailers only; branches come in Phase 2. The vendored `schemas/elohim-protocol/v1/enums/reach.schema.json` is already in the tree for future use. +- **ContentNode adapter** — no code that registers brit repos as ContentNodes in any external store. Out. +- **libp2p transport** (`/brit/fetch/1.0.0`). Out — Phase 3. 
+- **CID resolution** — the parser recognizes `Lamad-Node:`, `Shefa-Node:`, `Qahal-Node:` as CID-bearing trailer keys and stores the raw string, but does NOT parse it into a typed `Cid`, does NOT resolve it, does NOT check the target type. Phase 2. +- **JSON Schema codegen pipeline**. Phase 1 hand-writes types in Rust. Phase 2 introduces the codegen from `schemas/elohim-protocol/v1/*.schema.json` files when the ContentNode adapter work starts. +- **Signals emitted** (§9 catalog). Phase 1 doesn't emit any — it only parses and validates. Phase 2+ adds signal emission once there's something to emit them to. +- **`brit-cli` full binary**. Phase 1 ships only the minimum `brit-verify` example binary. The full brit subcommand surface from schema doc §3 is Phase 3+. + +## File structure + +``` +brit/ +├── brit-epr/ +│ ├── Cargo.toml # new crate, member of workspace +│ ├── src/ +│ │ ├── lib.rs # crate root, re-exports, feature-gated pub use +│ │ ├── engine/ +│ │ │ ├── mod.rs # module exports +│ │ │ ├── app_schema.rs # AppSchema trait (the dispatch contract) +│ │ │ ├── trailer_set.rs # TrailerSet type +│ │ │ ├── trailer_block.rs # TrailerBlock parser — wraps gix-object +│ │ │ └── error.rs # ValidationError, EngineError +│ │ └── elohim/ +│ │ ├── mod.rs # #[cfg(feature = "elohim-protocol")] +│ │ ├── schema.rs # ElohimProtocolSchema (impl AppSchema) +│ │ ├── pillar_trailers.rs # PillarTrailers strong type, TrailerKey enum +│ │ ├── parse.rs # parse_pillar_trailers +│ │ └── validate.rs # validate_pillar_trailers +│ └── tests/ +│ ├── engine_parsing.rs # engine-level trailer block extraction +│ ├── elohim_parse.rs # pillar trailer parsing (gated on feature) +│ ├── elohim_validate.rs # pillar validation (gated on feature) +│ └── fixtures/ +│ ├── happy_all_three_pillars.txt +│ ├── missing_qahal.txt +│ └── malformed_shefa_node.txt +├── brit-verify/ +│ ├── Cargo.toml # new binary crate +│ └── src/ +│ └── main.rs # CLI: brit-verify [--repo ] +└── Cargo.toml # modified: add workspace 
members +``` + +**Responsibilities per file:** + +- `brit-epr/src/engine/app_schema.rs` — the `AppSchema` trait. Engine only knows the contract, never which schema is plugged in. +- `brit-epr/src/engine/trailer_set.rs` — `TrailerSet` is a `Vec<(String, String)>`-backed structure preserving insertion order, with `get`, `get_all` (for repeatable keys), `iter`, and `Display` producing the canonical RFC-822 representation. +- `brit-epr/src/engine/trailer_block.rs` — one public function `parse_trailer_block(body: &[u8]) -> TrailerSet`. Uses `gix_object::commit::message::BodyRef::from_bytes` and `.trailers()`. No schema-specific knowledge. +- `brit-epr/src/engine/error.rs` — `EngineError`, `ValidationError` via `thiserror`. +- `brit-epr/src/elohim/schema.rs` — `ElohimProtocolSchema` zero-sized struct implementing `AppSchema`. The implementation names the six trailer keys, declares required keys, routes validation to the pair/set checkers. +- `brit-epr/src/elohim/pillar_trailers.rs` — `PillarTrailers` struct with `lamad/shefa/qahal` summary fields and `lamad_node/shefa_node/qahal_node` raw-CID-string fields. `TrailerKey` enum with `summary_token()` and `node_token()` accessors. +- `brit-epr/src/elohim/parse.rs` — `parse_pillar_trailers(body: &[u8]) -> PillarTrailers` convenience function that calls the engine's trailer-block parser and projects into the typed view. +- `brit-epr/src/elohim/validate.rs` — `validate_pillar_trailers(&PillarTrailers) -> Result<(), PillarValidationError>`. Structural validation only: all three summary trailers present and non-empty. No CID resolution, no cross-referential checks. +- `brit-epr/src/lib.rs` — re-exports `engine::*` unconditionally; re-exports `elohim::*` behind `#[cfg(feature = "elohim-protocol")]`. 
+- `brit-verify/src/main.rs` — minimal CLI (`std::env::args`, no clap), opens repo via `gix::discover`, reads commit, projects to body, calls `elohim::parse_pillar_trailers` + `elohim::validate_pillar_trailers`, prints summary + exits 0/1. + +--- + +## Task 0: Scaffolding — add `brit-epr` crate with engine/elohim split + +**Files:** +- Create: `brit-epr/Cargo.toml` +- Create: `brit-epr/src/lib.rs` +- Create: `brit-epr/src/engine/mod.rs` +- Create: `brit-epr/src/elohim/mod.rs` +- Modify: `Cargo.toml` (root — add to workspace members) + +- [ ] **Step 0.1: Create the crate manifest** + +Create `brit-epr/Cargo.toml`: + +```toml +lints.workspace = true + +[package] +name = "brit-epr" +version = "0.0.0" +description = "Elohim Protocol primitives (pillar trailers, dispatch trait, validation) for brit — an expansion of gitoxide with covenant semantics" +repository = "https://github.com/ethosengine/brit" +authors = ["Matthew Dowell "] +license = "MIT OR Apache-2.0" +edition = "2021" +rust-version = "1.82" + +[lib] +doctest = false + +[features] +default = ["elohim-protocol"] +# Gates the elohim module — brit's first-party app schema implementation. +# With this feature off, brit-epr is the covenant engine alone: trailer +# parsing, the AppSchema dispatch trait, error types. No pillar-specific +# behavior. A downstream fork can disable this feature and ship their own +# app schema crate. +elohim-protocol = [] + +[dependencies] +gix-object = { version = "^0.52.0", path = "../gix-object" } +thiserror = "2.0" +``` + +> **Note:** Version `^0.52.0` is illustrative. Read the actual value in `gix-object/Cargo.toml` in this workspace checkout and use the current major.minor. + +- [ ] **Step 0.2: Create the lib.rs crate root** + +Create `brit-epr/src/lib.rs`: + +```rust +//! Elohim Protocol primitives for brit. +//! +//! `brit-epr` has two layers: +//! +//! - **`engine`** — unconditional. The covenant engine: trailer parser, +//! 
`AppSchema` dispatch trait, `TrailerSet`, validation errors. Does not know +//! which schema is plugged in. A downstream fork can disable the default +//! feature and ship its own app schema on this engine. +//! - **`elohim`** — feature-gated behind `elohim-protocol` (default on). The +//! first-party Elohim Protocol app schema: pillar trailer types (Lamad, +//! Shefa, Qahal), the concrete `ElohimProtocolSchema` implementor, parse +//! and validate convenience functions. +//! +//! The normative specification for the trailer format, pillar meanings, and +//! validation rules lives in `docs/schemas/elohim-protocol-manifest.md` at +//! the root of the brit repository. When this crate and the schema doc +//! disagree, the schema doc wins. + +#![deny(missing_docs, rust_2018_idioms)] +#![forbid(unsafe_code)] + +pub mod engine; + +#[cfg(feature = "elohim-protocol")] +pub mod elohim; + +// Unconditional re-exports +pub use engine::{AppSchema, TrailerSet, ValidationError}; + +// Feature-gated re-exports +#[cfg(feature = "elohim-protocol")] +pub use elohim::{ + parse_pillar_trailers, validate_pillar_trailers, ElohimProtocolSchema, PillarTrailers, + PillarValidationError, TrailerKey, +}; +``` + +- [ ] **Step 0.3: Create the engine module stub** + +Create `brit-epr/src/engine/mod.rs`: + +```rust +//! Covenant engine — unconditional layer that knows the trailer format and +//! dispatch contract but not any specific schema vocabulary. + +mod app_schema; +mod error; +mod trailer_block; +mod trailer_set; + +pub use app_schema::AppSchema; +pub use error::{EngineError, ValidationError}; +pub use trailer_block::parse_trailer_block; +pub use trailer_set::TrailerSet; +``` + +- [ ] **Step 0.4: Create the elohim module stub** + +Create `brit-epr/src/elohim/mod.rs`: + +```rust +//! Elohim Protocol app schema — first-party `AppSchema` implementation. +//! +//! Gated behind `#[cfg(feature = "elohim-protocol")]`. With this feature +//! disabled, `brit-epr` ships only the engine. 
+ +mod parse; +mod pillar_trailers; +mod schema; +mod validate; + +pub use parse::parse_pillar_trailers; +pub use pillar_trailers::{PillarTrailers, TrailerKey}; +pub use schema::ElohimProtocolSchema; +pub use validate::{validate_pillar_trailers, PillarValidationError}; +``` + +- [ ] **Step 0.5: Add to workspace members** + +Edit root `Cargo.toml`. Find the `members = [` list and add `"brit-epr"` as the last entry before the closing `]`: + +```toml +# ... existing gix-* members ... + "gix-shallow", + "brit-epr", +] +``` + +- [ ] **Step 0.6: Verify workspace builds refuse to compile with missing modules** + +Run: + +``` +cargo build -p brit-epr +``` + +Expected: compile error. The module files referenced in Steps 0.3 and 0.4 don't exist yet (`app_schema.rs`, `error.rs`, etc.). This is expected — Task 1 creates them. If the build somehow passes, go back and verify the `mod` declarations in Steps 0.3 / 0.4 are present. + +- [ ] **Step 0.7: Commit** + +``` +git add brit-epr/Cargo.toml brit-epr/src/lib.rs brit-epr/src/engine/mod.rs brit-epr/src/elohim/mod.rs Cargo.toml +git commit -m "feat(brit-epr): scaffold crate with engine/elohim feature split + +Establishes the engine-vs-app-schema boundary from day 0. The engine +module is unconditional; the elohim module is gated behind the +elohim-protocol cargo feature (default on). Subsequent tasks land the +trait, types, parser, and validator." +``` + +--- + +## Task 1: Engine — define `AppSchema` trait, `TrailerSet`, error types + +**Files:** +- Create: `brit-epr/src/engine/error.rs` +- Create: `brit-epr/src/engine/app_schema.rs` +- Create: `brit-epr/src/engine/trailer_set.rs` + +- [ ] **Step 1.1: Create `engine/error.rs`** + +Create `brit-epr/src/engine/error.rs`: + +```rust +//! Engine-level error types. + +use thiserror::Error; + +/// Errors raised by the covenant engine's generic layer. +#[derive(Debug, Error)] +pub enum EngineError { + /// Unable to extract a trailer block from a commit body. 
+ #[error("failed to parse trailer block: {0}")] + TrailerBlockParse(String), +} + +/// Errors emitted by schema validation. App schemas return this type from +/// `AppSchema::validate_pair` and `AppSchema::validate_set`. +/// +/// Variants are intentionally broad because different app schemas will +/// express different failure modes. A richer error type can layer on top. +#[derive(Debug, Error, PartialEq, Eq)] +pub enum ValidationError { + /// A required trailer key was absent from the set. + #[error("required trailer key missing: {0}")] + MissingKey(String), + + /// A trailer value is present but empty or whitespace-only. + #[error("trailer key {0} has empty value")] + EmptyValue(String), + + /// A trailer value failed a format check (e.g., malformed CID). + #[error("trailer key {0} malformed: {1}")] + MalformedValue(String, String), + + /// Cross-field rule violated. + #[error("trailer set failed cross-field rule: {0}")] + CrossFieldRule(String), +} +``` + +- [ ] **Step 1.2: Create `engine/trailer_set.rs`** + +Create `brit-epr/src/engine/trailer_set.rs`: + +```rust +//! `TrailerSet` — ordered, duplicate-aware key/value pairs from a commit +//! trailer block. Preserves insertion order for roundtrip-compatible +//! rendering. + +use std::fmt; + +/// A commit trailer block, parsed into ordered key/value pairs. +/// +/// Order is preserved because the engine must be able to re-render the +/// trailer block byte-identically for signing and round-trip use cases. +/// Duplicate keys are allowed (e.g., multiple `Signed-off-by:` or +/// repeatable app-schema keys like `Built-By:`). +#[derive(Debug, Clone, Default, PartialEq, Eq)] +pub struct TrailerSet { + entries: Vec<(String, String)>, +} + +impl TrailerSet { + /// Create an empty set. + pub fn new() -> Self { + Self { entries: Vec::new() } + } + + /// Append a trailer entry, preserving insertion order. 
+    pub fn push(&mut self, key: impl Into<String>, value: impl Into<String>) {
+        self.entries.push((key.into(), value.into()));
+    }
+
+    /// Return the first value for a given key, or `None` if absent.
+    pub fn get(&self, key: &str) -> Option<&str> {
+        self.entries
+            .iter()
+            .find(|(k, _)| k == key)
+            .map(|(_, v)| v.as_str())
+    }
+
+    /// Return all values for a given key (preserves order).
+    pub fn get_all(&self, key: &str) -> Vec<&str> {
+        self.entries
+            .iter()
+            .filter(|(k, _)| k == key)
+            .map(|(_, v)| v.as_str())
+            .collect()
+    }
+
+    /// Iterate over all `(key, value)` pairs in insertion order.
+    pub fn iter(&self) -> impl Iterator<Item = (&str, &str)> + '_ {
+        self.entries.iter().map(|(k, v)| (k.as_str(), v.as_str()))
+    }
+
+    /// Number of entries.
+    pub fn len(&self) -> usize {
+        self.entries.len()
+    }
+
+    /// True when there are no entries.
+    pub fn is_empty(&self) -> bool {
+        self.entries.is_empty()
+    }
+}
+
+impl fmt::Display for TrailerSet {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        for (k, v) in &self.entries {
+            writeln!(f, "{k}: {v}")?;
+        }
+        Ok(())
+    }
+}
+```
+
+- [ ] **Step 1.3: Create `engine/app_schema.rs`**
+
+Create `brit-epr/src/engine/app_schema.rs`:
+
+```rust
+//! `AppSchema` — the dispatch contract between the covenant engine and
+//! specific app schemas (e.g., `elohim-protocol`).
+//!
+//! The normative specification is in `docs/schemas/elohim-protocol-manifest.md`
+//! §2.3. This file is the Rust projection of that contract.
+
+use crate::engine::{TrailerSet, ValidationError};
+
+/// Dispatch contract that app schemas implement.
+///
+/// The engine consumes an `impl AppSchema` to do validation and rendering
+/// without knowing the specific vocabulary (Lamad / Shefa / Qahal, or any
+/// other app's keys). This is what keeps the engine/app-schema boundary
+/// legible — see `elohim-protocol-manifest.md` §11.7 for boundary smells
+/// that indicate the boundary is drifting.
+pub trait AppSchema {
+    /// Stable identifier for this schema, e.g.
`"elohim-protocol/1.0.0"`.
+    fn id(&self) -> &'static str;
+
+    /// Does this schema recognize this trailer key?
+    fn owns_key(&self, key: &str) -> bool;
+
+    /// Required keys. Engine uses this to short-circuit validation when the
+    /// commit message is missing the required surface entirely.
+    fn required_keys(&self) -> &'static [&'static str];
+
+    /// Which keys carry CID references? The resolver walks these in later
+    /// phases. Phase 1 just records the list.
+    fn cid_bearing_keys(&self) -> &'static [&'static str];
+
+    /// Validate one `(key, value)` pair in isolation (no cross-field rules).
+    fn validate_pair(&self, key: &str, value: &str) -> Result<(), ValidationError>;
+
+    /// Validate the whole trailer set together (cross-field rules, e.g.
+    /// "`Lamad-Node:` present requires `Lamad:` non-empty").
+    fn validate_set(&self, trailers: &TrailerSet) -> Result<(), ValidationError>;
+}
+```
+
+- [ ] **Step 1.4: Build-check the engine module**
+
+Run:
+
+```
+cargo build -p brit-epr --no-default-features
+```
+
+Expected: the first run fails with a `file not found for module trailer_block` error, because `engine/mod.rs` declares that module but this task never creates the file. Stub it now by creating `brit-epr/src/engine/trailer_block.rs` containing:
+
+```rust
+//! Stubbed; Task 2 implements this.
+use crate::engine::TrailerSet;
+
+/// Parse a commit body into a `TrailerSet`. Stub — Task 2 replaces this.
+pub fn parse_trailer_block(_body: &[u8]) -> TrailerSet {
+    TrailerSet::new()
+}
+```
+
+Then retry the build. The `--no-default-features` flag proves the engine compiles without the elohim module — this is the boundary check.
+
+- [ ] **Step 1.5: Commit**
+
+```
+git add brit-epr/src/engine/
+git commit -m "feat(brit-epr/engine): add AppSchema trait, TrailerSet, errors
+
+Engine layer is now independently compilable with --no-default-features.
+Proves the engine/app-schema boundary holds from day 0: the engine
+knows nothing about Lamad/Shefa/Qahal specifically."
+```
+
+---
+
+## Task 2: Engine — implement `parse_trailer_block` using `gix-object`
+
+**Files:**
+- Modify: `brit-epr/src/engine/trailer_block.rs` (replace stub)
+- Create: `brit-epr/tests/engine_parsing.rs`
+
+- [ ] **Step 2.1: Write the failing engine-level test**
+
+Create `brit-epr/tests/engine_parsing.rs`:
+
+```rust
+//! Engine-level tests — trailer block extraction, no app-schema semantics.
+
+use brit_epr::engine::{parse_trailer_block, TrailerSet};
+
+#[test]
+fn extracts_trailer_block_from_commit_body() {
+    let body = b"\
+Add pillar trailer parser
+
+Wires gix-object into the covenant engine so trailer blocks can be
+extracted into a schema-agnostic TrailerSet.
+
+Signed-off-by: Matthew Dowell
+Lamad: introduces pillar trailer model
+Shefa: stewardship by @matthew
+Qahal: no governance review required
+";
+
+    let trailers: TrailerSet = parse_trailer_block(body);
+
+    assert_eq!(trailers.len(), 4, "expected 4 trailers, got {}", trailers.len());
+    assert_eq!(trailers.get("Signed-off-by"), Some("Matthew Dowell"));
+    assert_eq!(trailers.get("Lamad"), Some("introduces pillar trailer model"));
+    assert_eq!(trailers.get("Shefa"), Some("stewardship by @matthew"));
+    assert_eq!(trailers.get("Qahal"), Some("no governance review required"));
+}
+
+#[test]
+fn empty_trailer_block_returns_empty_set() {
+    let body = b"Commit with no trailers at all, just a body.";
+    let trailers = parse_trailer_block(body);
+    assert_eq!(trailers.len(), 0);
+}
+```
+
+- [ ] **Step 2.2: Run the tests — expect failure**
+
+Run:
+
+```
+cargo test -p brit-epr --test engine_parsing
+```
+
+Expected: one or both tests fail because the stub from Step 1.4 returns an empty `TrailerSet`. If you see "cannot find function `parse_trailer_block`", you may have forgotten to `pub use` it from `engine/mod.rs` — check Task 0 Step 0.3.
+ +- [ ] **Step 2.3: Implement the real parser** + +Replace `brit-epr/src/engine/trailer_block.rs` with: + +```rust +//! `parse_trailer_block` — extract a commit's RFC-822-style trailer block +//! into a `TrailerSet`. Wraps `gix_object::commit::message::BodyRef::trailers()`. + +use gix_object::commit::message::BodyRef; + +use crate::engine::TrailerSet; + +/// Parse a commit body's bytes into a `TrailerSet`. +/// +/// The body is the message *after* the commit headers (author, committer, +/// tree, parent lines) — i.e., what gitoxide calls "the body" of a commit. +/// This function extracts the final trailing block of `Key: value` lines +/// (if any) and records each as an entry in a `TrailerSet`, preserving +/// insertion order. +/// +/// Returns an empty `TrailerSet` if the body has no trailer block. +pub fn parse_trailer_block(body: &[u8]) -> TrailerSet { + let body_ref = BodyRef::from_bytes(body); + let mut set = TrailerSet::new(); + + for trailer in body_ref.trailers() { + // BStr → String via to_str_lossy.into_owned. Safe because commit + // messages are conventionally UTF-8 and the lossy conversion + // preserves whatever bytes we got. + let key = trailer.token.to_str_lossy().into_owned(); + let value = trailer.value.to_str_lossy().into_owned(); + set.push(key, value); + } + + set +} +``` + +- [ ] **Step 2.4: Run the tests — expect pass** + +Run: + +``` +cargo test -p brit-epr --test engine_parsing +``` + +Expected: both tests pass. If `to_str_lossy` doesn't exist, the correct method on `bstr::BStr` in the vendored gitoxide version may be `to_str_lossy().to_string()` or similar — grep `gix-object/src/commit/message/body.rs` for `to_str_lossy` or `to_string` usage to confirm the idiom used in this workspace. 
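One behavior worth pinning down before committing: `TrailerSet` is duplicate-aware, so `get` returns only the *first* value for a key while `get_all` returns every value in insertion order. A standalone sketch of that contract (the `Set` type below is a local replica of Task 1's `TrailerSet`, not the crate itself):

```rust
// Local replica of TrailerSet's lookup contract from Task 1:
// `get` is first-match, `get_all` is order-preserving.
struct Set {
    entries: Vec<(String, String)>,
}

impl Set {
    fn get(&self, key: &str) -> Option<&str> {
        self.entries.iter().find(|(k, _)| k == key).map(|(_, v)| v.as_str())
    }

    fn get_all(&self, key: &str) -> Vec<&str> {
        self.entries
            .iter()
            .filter(|(k, _)| k == key)
            .map(|(_, v)| v.as_str())
            .collect()
    }
}

fn demo_set() -> Set {
    // Two Signed-off-by trailers, as produced by a multi-signer commit.
    Set {
        entries: vec![
            ("Signed-off-by".into(), "alice".into()),
            ("Signed-off-by".into(), "bob".into()),
            ("Lamad".into(), "summary".into()),
        ],
    }
}

fn main() {
    let set = demo_set();
    assert_eq!(set.get("Signed-off-by"), Some("alice")); // first match only
    assert_eq!(set.get_all("Signed-off-by"), vec!["alice", "bob"]); // all, in order
    println!("duplicate-key contract holds");
}
```

Related detail grounded in Task 4's parser: for the six pillar keys, repeated occurrences overwrite the `Option` fields, so the *last* occurrence wins there. If repeated pillar keys ever become legal, that asymmetry deserves an explicit test.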
+ +- [ ] **Step 2.5: Commit** + +``` +git add brit-epr/src/engine/trailer_block.rs brit-epr/tests/engine_parsing.rs +git commit -m "feat(brit-epr/engine): implement parse_trailer_block via gix-object + +Wraps gix_object::commit::message::BodyRef::trailers() into a +schema-agnostic TrailerSet. Engine-level tests prove extraction +works for happy path and no-trailers case." +``` + +--- + +## Task 3: Elohim — `PillarTrailers`, `TrailerKey`, `ElohimProtocolSchema` + +**Files:** +- Create: `brit-epr/src/elohim/pillar_trailers.rs` +- Create: `brit-epr/src/elohim/schema.rs` + +- [ ] **Step 3.1: Create `pillar_trailers.rs`** + +Create `brit-epr/src/elohim/pillar_trailers.rs`: + +```rust +//! Pillar trailer types — the strongly-typed view the elohim app schema +//! uses to represent the three pillars plus their linked-node CID slots. + +/// Which of the three pillars a trailer belongs to. +/// +/// The elohim protocol pillars: +/// +/// - **Lamad** (לָמַד, "to learn") — knowledge positioning. +/// - **Shefa** (שֶׁפַע, "abundance") — economic positioning. +/// - **Qahal** (קָהָל, "assembly") — governance positioning. +/// +/// Each pillar has two trailer forms: a canonical summary (e.g., `Lamad:`) +/// and a linked-node CID reference (e.g., `Lamad-Node:`). +#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)] +pub enum TrailerKey { + /// Knowledge-layer trailer. + Lamad, + /// Economic-layer trailer. + Shefa, + /// Governance-layer trailer. + Qahal, +} + +impl TrailerKey { + /// The RFC-822 token name for the canonical-summary trailer. + pub fn summary_token(self) -> &'static str { + match self { + TrailerKey::Lamad => "Lamad", + TrailerKey::Shefa => "Shefa", + TrailerKey::Qahal => "Qahal", + } + } + + /// The RFC-822 token name for the linked-node CID trailer. 
+    pub fn node_token(self) -> &'static str {
+        match self {
+            TrailerKey::Lamad => "Lamad-Node",
+            TrailerKey::Shefa => "Shefa-Node",
+            TrailerKey::Qahal => "Qahal-Node",
+        }
+    }
+
+    /// All three pillars, in canonical order.
+    pub fn all() -> [TrailerKey; 3] {
+        [TrailerKey::Lamad, TrailerKey::Shefa, TrailerKey::Qahal]
+    }
+}
+
+/// Pillar trailers extracted from a commit body and projected into the
+/// typed view the elohim app schema uses.
+///
+/// Each `*_node` field holds the raw CID *string* — Phase 1 does not parse
+/// the string into a typed `Cid`, does not resolve it, and does not check
+/// the target's type. The parser is permissive; strict CID validation and
+/// resolution arrive in Phase 2.
+#[derive(Debug, Clone, Default, PartialEq, Eq)]
+pub struct PillarTrailers {
+    /// Canonical summary value of the `Lamad:` trailer, trimmed.
+    pub lamad: Option<String>,
+    /// Canonical summary value of the `Shefa:` trailer, trimmed.
+    pub shefa: Option<String>,
+    /// Canonical summary value of the `Qahal:` trailer, trimmed.
+    pub qahal: Option<String>,
+
+    /// Raw CID string from a `Lamad-Node:` trailer, if present. Phase 1
+    /// does not parse or resolve this.
+    pub lamad_node: Option<String>,
+    /// Raw CID string from a `Shefa-Node:` trailer, if present.
+    pub shefa_node: Option<String>,
+    /// Raw CID string from a `Qahal-Node:` trailer, if present.
+    pub qahal_node: Option<String>,
+}
+```
+
+- [ ] **Step 3.2: Create `schema.rs`**
+
+Create `brit-epr/src/elohim/schema.rs`:
+
+```rust
+//! `ElohimProtocolSchema` — the first-party `AppSchema` implementation.
+
+use crate::elohim::pillar_trailers::TrailerKey;
+use crate::engine::{AppSchema, TrailerSet, ValidationError};
+
+/// Zero-sized implementor of [`AppSchema`] for the Elohim Protocol.
+///
+/// Instances are stateless. Typically you construct one like
+/// `const SCHEMA: ElohimProtocolSchema = ElohimProtocolSchema;` and pass
+/// by reference.
+#[derive(Debug, Clone, Copy, Default)]
+pub struct ElohimProtocolSchema;
+
+const SUMMARY_KEYS: &[&str] = &["Lamad", "Shefa", "Qahal"];
+const NODE_KEYS: &[&str] = &["Lamad-Node", "Shefa-Node", "Qahal-Node"];
+
+impl AppSchema for ElohimProtocolSchema {
+    fn id(&self) -> &'static str {
+        "elohim-protocol/1.0.0"
+    }
+
+    fn owns_key(&self, key: &str) -> bool {
+        SUMMARY_KEYS.contains(&key) || NODE_KEYS.contains(&key)
+    }
+
+    fn required_keys(&self) -> &'static [&'static str] {
+        SUMMARY_KEYS
+    }
+
+    fn cid_bearing_keys(&self) -> &'static [&'static str] {
+        NODE_KEYS
+    }
+
+    fn validate_pair(&self, key: &str, value: &str) -> Result<(), ValidationError> {
+        if !self.owns_key(key) {
+            return Ok(()); // not our key; ignore
+        }
+        if value.trim().is_empty() {
+            return Err(ValidationError::EmptyValue(key.to_string()));
+        }
+        // Phase 1: no additional format checks. Phase 2 adds CID parsing on
+        // NODE_KEYS.
+        Ok(())
+    }
+
+    fn validate_set(&self, trailers: &TrailerSet) -> Result<(), ValidationError> {
+        // Check required keys are present in canonical order so the error
+        // always names Lamad before Shefa before Qahal.
+        for key in TrailerKey::all() {
+            let summary = key.summary_token();
+            match trailers.get(summary) {
+                None => return Err(ValidationError::MissingKey(summary.to_string())),
+                Some(v) if v.trim().is_empty() => {
+                    return Err(ValidationError::EmptyValue(summary.to_string()))
+                }
+                Some(_) => {}
+            }
+        }
+        Ok(())
+    }
+}
+```
+
+- [ ] **Step 3.3: Build-check**
+
+Run:
+
+```
+cargo build -p brit-epr
+```
+
+Expected: compiles with default features (the elohim module is enabled). If `parse` or `validate` module files are missing, create stub files (doc comments included, so `#![deny(missing_docs)]` stays satisfied):
+
+```rust
+// brit-epr/src/elohim/parse.rs
+//! Stubbed; Task 4 implements.
+use super::PillarTrailers;
+
+/// Stub — Task 4 replaces this.
+pub fn parse_pillar_trailers(_body: &[u8]) -> PillarTrailers {
+    PillarTrailers::default()
+}
+```
+
+```rust
+// brit-epr/src/elohim/validate.rs
+//! Stubbed; Task 5 implements.
+use super::PillarTrailers;
+use thiserror::Error;
+
+/// Stub error type — Task 5 replaces this.
+#[derive(Debug, Error, PartialEq, Eq)]
+pub enum PillarValidationError {
+    /// Placeholder variant.
+    #[error("stub")]
+    Stub,
+}
+
+/// Stub — Task 5 replaces this.
+pub fn validate_pillar_trailers(_trailers: &PillarTrailers) -> Result<(), PillarValidationError> {
+    Ok(())
+}
+```
+
+Retry the build; it should now pass.
+
+- [ ] **Step 3.4: Commit**
+
+```
+git add brit-epr/src/elohim/
+git commit -m "feat(brit-epr/elohim): add PillarTrailers, TrailerKey, schema impl
+
+ElohimProtocolSchema implements AppSchema with closed vocabulary
+(Lamad/Shefa/Qahal summary keys + their -Node CID counterparts).
+Phase 1 stores raw CID strings without parsing — CID resolution
+arrives in Phase 2."
+```
+
+---
+
+## Task 4: Elohim — `parse_pillar_trailers` (TDD)
+
+**Files:**
+- Modify: `brit-epr/src/elohim/parse.rs` (replace stub)
+- Create: `brit-epr/tests/elohim_parse.rs`
+- Create: `brit-epr/tests/fixtures/happy_all_three_pillars.txt`
+- Create: `brit-epr/tests/fixtures/missing_qahal.txt`
+- Create: `brit-epr/tests/fixtures/malformed_shefa_node.txt`
+
+- [ ] **Step 4.1: Write the first failing test — happy path**
+
+Create `brit-epr/tests/fixtures/happy_all_three_pillars.txt`:
+
+```
+Add pillar trailer parser
+
+Wires gix-object::BodyRef::trailers() into the brit-epr engine so
+commit messages can carry Lamad / Shefa / Qahal values natively.
+
+Signed-off-by: Matthew Dowell
+Lamad: introduces pillar trailer model; first testable EPR primitive
+Shefa: stewardship by @matthew; contributor credit via git author
+Qahal: no governance review required for scaffolding
+```
+
+Create `brit-epr/tests/elohim_parse.rs`:
+
+```rust
+//! Integration tests for elohim pillar trailer parsing.
+
+use brit_epr::{parse_pillar_trailers, PillarTrailers};
+
+fn fixture(name: &str) -> Vec<u8> {
+    let path = format!("tests/fixtures/{}", name);
+    std::fs::read(&path).unwrap_or_else(|e| panic!("failed to read fixture {path}: {e}"))
+}
+
+#[test]
+fn happy_path_all_three_pillars_parse() {
+    let body = fixture("happy_all_three_pillars.txt");
+    let trailers: PillarTrailers = parse_pillar_trailers(&body);
+
+    assert_eq!(
+        trailers.lamad.as_deref(),
+        Some("introduces pillar trailer model; first testable EPR primitive")
+    );
+    assert_eq!(
+        trailers.shefa.as_deref(),
+        Some("stewardship by @matthew; contributor credit via git author")
+    );
+    assert_eq!(
+        trailers.qahal.as_deref(),
+        Some("no governance review required for scaffolding")
+    );
+    assert_eq!(trailers.lamad_node, None);
+    assert_eq!(trailers.shefa_node, None);
+    assert_eq!(trailers.qahal_node, None);
+}
+```
+
+- [ ] **Step 4.2: Run the test — expect failure**
+
+Run:
+
+```
+cargo test -p brit-epr --test elohim_parse happy_path_all_three_pillars_parse
+```
+
+Expected: fails because the stub from Task 3 returns `PillarTrailers::default()`. All assertions about `lamad/shefa/qahal` having `Some(...)` values fail.
+
+- [ ] **Step 4.3: Implement the parser**
+
+Replace `brit-epr/src/elohim/parse.rs` with:
+
+```rust
+//! `parse_pillar_trailers` — convenience function that projects a
+//! `TrailerSet` into the strongly-typed `PillarTrailers` view.
+
+use crate::elohim::pillar_trailers::{PillarTrailers, TrailerKey};
+use crate::engine::parse_trailer_block;
+
+/// Parse pillar trailers from a commit body.
+///
+/// Pure function: no I/O beyond reading the body slice. Unknown trailers
+/// (anything outside the six reserved pillar keys) are silently skipped —
+/// a commit may carry `Signed-off-by:`, `Co-Authored-By:`, etc., alongside
+/// the pillar trailers.
+///
+/// Permissive: malformed values in `*_Node:` trailers are accepted as raw
+/// strings. Strict validation is done by `validate_pillar_trailers`.
+pub fn parse_pillar_trailers(body: &[u8]) -> PillarTrailers { + let set = parse_trailer_block(body); + let mut out = PillarTrailers::default(); + + for (key, value) in set.iter() { + for pillar in TrailerKey::all() { + if key == pillar.summary_token() { + match pillar { + TrailerKey::Lamad => out.lamad = Some(value.to_string()), + TrailerKey::Shefa => out.shefa = Some(value.to_string()), + TrailerKey::Qahal => out.qahal = Some(value.to_string()), + } + } else if key == pillar.node_token() { + match pillar { + TrailerKey::Lamad => out.lamad_node = Some(value.to_string()), + TrailerKey::Shefa => out.shefa_node = Some(value.to_string()), + TrailerKey::Qahal => out.qahal_node = Some(value.to_string()), + } + } + } + } + + out +} +``` + +- [ ] **Step 4.4: Run the test — expect pass** + +Run: + +``` +cargo test -p brit-epr --test elohim_parse happy_path_all_three_pillars_parse +``` + +Expected: pass. + +- [ ] **Step 4.5: Add the partial-pillars test** + +Create `brit-epr/tests/fixtures/missing_qahal.txt`: + +``` +Routine refactor with only two pillars declared + +Lamad: no knowledge change — pure refactor +Shefa: no value flow — maintenance work +``` + +Append to `brit-epr/tests/elohim_parse.rs`: + +```rust +#[test] +fn missing_qahal_parses_partially() { + let body = fixture("missing_qahal.txt"); + let trailers = parse_pillar_trailers(&body); + + assert_eq!(trailers.lamad.as_deref(), Some("no knowledge change — pure refactor")); + assert_eq!(trailers.shefa.as_deref(), Some("no value flow — maintenance work")); + assert_eq!(trailers.qahal, None); +} +``` + +- [ ] **Step 4.6: Add the malformed-node test** + +Create `brit-epr/tests/fixtures/malformed_shefa_node.txt`: + +``` +Test permissive parser behavior for malformed node ref + +Lamad: teaches the permissive parser behavior +Shefa: value summary is fine +Shefa-Node: not-a-valid-cid-at-all +Qahal: governance review complete +``` + +Append to `brit-epr/tests/elohim_parse.rs`: + +```rust +#[test] +fn 
malformed_shefa_node_stored_as_raw_string() { + let body = fixture("malformed_shefa_node.txt"); + let trailers = parse_pillar_trailers(&body); + + assert_eq!(trailers.lamad.as_deref(), Some("teaches the permissive parser behavior")); + assert_eq!(trailers.shefa.as_deref(), Some("value summary is fine")); + assert_eq!(trailers.qahal.as_deref(), Some("governance review complete")); + + // Phase 1 is permissive — stores raw string without parsing. + // Phase 2 will add typed CID parsing and reject malformed values. + assert_eq!(trailers.shefa_node.as_deref(), Some("not-a-valid-cid-at-all")); +} +``` + +- [ ] **Step 4.7: Run all elohim_parse tests** + +Run: + +``` +cargo test -p brit-epr --test elohim_parse +``` + +Expected: 3 tests pass. + +- [ ] **Step 4.8: Commit** + +``` +git add brit-epr/src/elohim/parse.rs brit-epr/tests/elohim_parse.rs brit-epr/tests/fixtures/ +git commit -m "feat(brit-epr/elohim): implement parse_pillar_trailers + +Projects engine's schema-agnostic TrailerSet into the typed +PillarTrailers view. Permissive: unknown trailers skipped, malformed +node refs stored as raw strings. Three fixtures cover happy path, +partial declaration, and malformed node-ref." +``` + +--- + +## Task 5: Elohim — `validate_pillar_trailers` (TDD) + +**Files:** +- Modify: `brit-epr/src/elohim/validate.rs` (replace stub) +- Create: `brit-epr/tests/elohim_validate.rs` + +- [ ] **Step 5.1: Write the first failing test** + +Create `brit-epr/tests/elohim_validate.rs`: + +```rust +//! Integration tests for elohim pillar structural validation. 
+ +use brit_epr::{validate_pillar_trailers, PillarTrailers, PillarValidationError, TrailerKey}; + +fn complete() -> PillarTrailers { + PillarTrailers { + lamad: Some("knowledge summary".into()), + shefa: Some("economic summary".into()), + qahal: Some("governance summary".into()), + lamad_node: None, + shefa_node: None, + qahal_node: None, + } +} + +#[test] +fn all_three_present_validates_ok() { + assert_eq!(validate_pillar_trailers(&complete()), Ok(())); +} + +#[test] +fn missing_lamad_fails_with_missing_key() { + let mut t = complete(); + t.lamad = None; + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::MissingPillar(TrailerKey::Lamad)) + ); +} + +#[test] +fn empty_shefa_fails_with_empty_value() { + let mut t = complete(); + t.shefa = Some(" ".into()); + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::EmptyPillar(TrailerKey::Shefa)) + ); +} + +#[test] +fn returns_first_error_in_canonical_order() { + let t = PillarTrailers { + lamad: None, + shefa: Some("ok".into()), + qahal: None, + ..Default::default() + }; + assert_eq!( + validate_pillar_trailers(&t), + Err(PillarValidationError::MissingPillar(TrailerKey::Lamad)) + ); +} +``` + +- [ ] **Step 5.2: Run — expect failure** + +Run: + +``` +cargo test -p brit-epr --test elohim_validate +``` + +Expected: compilation errors for `PillarValidationError::MissingPillar` and `::EmptyPillar` because the stub used `Stub`. + +- [ ] **Step 5.3: Implement the validator** + +Replace `brit-epr/src/elohim/validate.rs` with: + +```rust +//! Structural validation for pillar trailers. +//! +//! Checks that each pillar has a non-empty summary value. Does NOT resolve +//! linked-node CIDs, does NOT traverse the ContentNode graph, does NOT +//! enforce domain rules — those live in higher layers (Phase 2+). + +use thiserror::Error; + +use crate::elohim::pillar_trailers::{PillarTrailers, TrailerKey}; + +/// Structural validation errors. 
+#[derive(Debug, Error, PartialEq, Eq)] +pub enum PillarValidationError { + /// Required pillar summary trailer is missing. + #[error("required pillar trailer missing: {0:?}")] + MissingPillar(TrailerKey), + + /// Pillar summary trailer is present but empty after trimming. + #[error("pillar trailer {0:?} is present but value is empty")] + EmptyPillar(TrailerKey), +} + +/// Structurally validate a `PillarTrailers` view. +/// +/// Returns `Ok(())` if all three summary trailers are present and non-empty. +/// Returns the first error in canonical order (Lamad → Shefa → Qahal). +/// +/// Linked-node CID strings are ignored by this validator — Phase 1 does +/// not enforce their format or resolvability. +pub fn validate_pillar_trailers(t: &PillarTrailers) -> Result<(), PillarValidationError> { + for pillar in TrailerKey::all() { + let summary = match pillar { + TrailerKey::Lamad => t.lamad.as_deref(), + TrailerKey::Shefa => t.shefa.as_deref(), + TrailerKey::Qahal => t.qahal.as_deref(), + }; + match summary { + None => return Err(PillarValidationError::MissingPillar(pillar)), + Some(v) if v.trim().is_empty() => { + return Err(PillarValidationError::EmptyPillar(pillar)) + } + Some(_) => {} + } + } + Ok(()) +} +``` + +- [ ] **Step 5.4: Run all tests** + +Run: + +``` +cargo test -p brit-epr +``` + +Expected: all tests pass (engine_parsing: 2, elohim_parse: 3, elohim_validate: 4 → 9 total). + +- [ ] **Step 5.5: Verify engine-only build still works** + +Run: + +``` +cargo build -p brit-epr --no-default-features +``` + +Expected: compiles. This proves the engine/app-schema boundary still holds after all the elohim code landed. + +- [ ] **Step 5.6: Commit** + +``` +git add brit-epr/src/elohim/validate.rs brit-epr/tests/elohim_validate.rs +git commit -m "feat(brit-epr/elohim): add structural pillar validator + +validate_pillar_trailers enforces all three pillar summary trailers +are present and non-empty. Errors in canonical order Lamad → Shefa +→ Qahal. 
No CID resolution, no graph traversal — those are Phase 2."
+```
+
+---
+
+## Task 6: Build the `brit-verify` CLI
+
+**Files:**
+- Create: `brit-verify/Cargo.toml`
+- Create: `brit-verify/src/main.rs`
+- Modify: `Cargo.toml` (root — add `"brit-verify"` to workspace members)
+
+- [ ] **Step 6.1: Create the binary manifest**
+
+Create `brit-verify/Cargo.toml`:
+
+```toml
+lints.workspace = true
+
+[package]
+name = "brit-verify"
+version = "0.0.0"
+description = "Verify pillar trailers on a git commit — the first brit binary"
+repository = "https://github.com/ethosengine/brit"
+authors = ["Matthew Dowell"]
+license = "MIT OR Apache-2.0"
+edition = "2021"
+rust-version = "1.82"
+
+[[bin]]
+name = "brit-verify"
+path = "src/main.rs"
+
+[dependencies]
+brit-epr = { version = "^0.0.0", path = "../brit-epr" }
+gix = { version = "^0.74.0", path = "../gix", default-features = false, features = ["revision"] }
+```
+
+> **Note:** The `gix` version and feature flags are illustrative. Read `gix/Cargo.toml` in this workspace for the actual current version. Try the smallest feature set that lets you open a repo and read a commit by rev (`revision` is probably enough). If cargo complains about a missing method in Step 6.3, enlarge the feature set.
+
+- [ ] **Step 6.2: Add to workspace members**
+
+Edit root `Cargo.toml`:
+
+```toml
+    "brit-epr",
+    "brit-verify",
+]
+```
+
+- [ ] **Step 6.3: Implement the binary**
+
+Create `brit-verify/src/main.rs`:
+
+```rust
+//! `brit-verify` — verify pillar trailers on a git commit.
+//!
+//! Usage: `brit-verify <rev> [--repo <path>]`
+//!
+//! Opens the repository at `<path>` (current directory if omitted), resolves
+//! `<rev>` to a commit object, extracts the commit message body,
+//! parses pillar trailers with brit-epr, runs structural validation, and
+//! prints the result. Exits 0 on success, 1 on validation failure, 2 on
+//! usage error, 3 on repo error.
+//!
+//! No clap, no tracing — smallest possible end-to-end proof that parser
+//!
and validator work against real git objects.
+
+use std::process::ExitCode;
+
+use brit_epr::{parse_pillar_trailers, validate_pillar_trailers};
+
+fn main() -> ExitCode {
+    let args: Vec<String> = std::env::args().collect();
+
+    let (rev, repo_path) = match parse_args(&args) {
+        Ok(parsed) => parsed,
+        Err(msg) => {
+            eprintln!("{msg}\n\nUsage: brit-verify <rev> [--repo <path>]");
+            return ExitCode::from(2);
+        }
+    };
+
+    let repo = match gix::discover(&repo_path) {
+        Ok(repo) => repo,
+        Err(e) => {
+            eprintln!("failed to open repo at {repo_path}: {e}");
+            return ExitCode::from(3);
+        }
+    };
+
+    let commit = match repo.rev_parse_single(rev.as_str()) {
+        Ok(id) => match id.object() {
+            Ok(obj) => match obj.try_into_commit() {
+                Ok(c) => c,
+                Err(_) => {
+                    eprintln!("rev {rev} does not point at a commit");
+                    return ExitCode::from(3);
+                }
+            },
+            Err(e) => {
+                eprintln!("failed to load object for {rev}: {e}");
+                return ExitCode::from(3);
+            }
+        },
+        Err(e) => {
+            eprintln!("failed to resolve rev {rev}: {e}");
+            return ExitCode::from(3);
+        }
+    };
+
+    let decoded = match commit.decode() {
+        Ok(c) => c,
+        Err(e) => {
+            eprintln!("failed to decode commit {rev}: {e}");
+            return ExitCode::from(3);
+        }
+    };
+
+    // decoded.message is the full message including trailing trailers.
+ let trailers = parse_pillar_trailers(decoded.message); + + match validate_pillar_trailers(&trailers) { + Ok(()) => { + println!("✓ pillar trailers valid for {rev}"); + println!(" Lamad: {}", trailers.lamad.as_deref().unwrap_or("-")); + println!(" Shefa: {}", trailers.shefa.as_deref().unwrap_or("-")); + println!(" Qahal: {}", trailers.qahal.as_deref().unwrap_or("-")); + if let Some(ref c) = trailers.lamad_node { + println!(" Lamad-Node: {c}"); + } + if let Some(ref c) = trailers.shefa_node { + println!(" Shefa-Node: {c}"); + } + if let Some(ref c) = trailers.qahal_node { + println!(" Qahal-Node: {c}"); + } + ExitCode::SUCCESS + } + Err(e) => { + eprintln!("✗ pillar validation failed for {rev}: {e}"); + ExitCode::FAILURE + } + } +} + +fn parse_args(args: &[String]) -> Result<(String, String), String> { + if args.len() < 2 { + return Err("missing argument".into()); + } + let rev = args[1].clone(); + let mut repo_path = ".".to_string(); + + let mut i = 2; + while i < args.len() { + match args[i].as_str() { + "--repo" => { + i += 1; + if i >= args.len() { + return Err("--repo requires a path argument".into()); + } + repo_path = args[i].clone(); + i += 1; + } + unknown => return Err(format!("unknown argument: {unknown}")), + } + } + Ok((rev, repo_path)) +} +``` + +> **Note:** The `gix` API surface (`discover`, `rev_parse_single`, `object`, `try_into_commit`, `decode`) is stable in recent gitoxide but names may shift. If cargo complains, `rg -n 'pub fn discover' ../gix/src/` and `rg -n 'rev_parse_single' ../gix/src/` to find the current signatures in this workspace's checkout. Swap method names as needed. The core shape (open repo → resolve rev → decode commit → get message → call parser + validator) is stable even if names drift. The `decode().message` field contains the full commit message including the trailer block — pass it directly to `parse_pillar_trailers`. 
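Independent of the drift-prone gix surface, the argument contract (`<rev>` positional, optional `--repo <path>`, unknown flags rejected) can be checked standalone. The sketch below copies `parse_args` verbatim from the listing above; the `argv` helper is just test scaffolding, not part of the plan:

```rust
// `parse_args` copied from brit-verify/src/main.rs above.
fn parse_args(args: &[String]) -> Result<(String, String), String> {
    if args.len() < 2 {
        return Err("missing argument".into());
    }
    let rev = args[1].clone();
    let mut repo_path = ".".to_string();

    let mut i = 2;
    while i < args.len() {
        match args[i].as_str() {
            "--repo" => {
                i += 1;
                if i >= args.len() {
                    return Err("--repo requires a path argument".into());
                }
                repo_path = args[i].clone();
                i += 1;
            }
            unknown => return Err(format!("unknown argument: {unknown}")),
        }
    }
    Ok((rev, repo_path))
}

// Test scaffolding: build an owned argv from string literals.
fn argv(v: &[&str]) -> Vec<String> {
    v.iter().map(|s| s.to_string()).collect()
}

fn main() {
    // rev only: repo defaults to the current directory.
    assert_eq!(
        parse_args(&argv(&["brit-verify", "HEAD"])),
        Ok(("HEAD".into(), ".".into()))
    );
    // rev plus explicit repo path.
    assert_eq!(
        parse_args(&argv(&["brit-verify", "abc123", "--repo", "/tmp/r"])),
        Ok(("abc123".into(), "/tmp/r".into()))
    );
    // Usage errors: no rev, dangling --repo, unknown flag.
    assert!(parse_args(&argv(&["brit-verify"])).is_err());
    assert!(parse_args(&argv(&["brit-verify", "HEAD", "--repo"])).is_err());
    assert!(parse_args(&argv(&["brit-verify", "HEAD", "--verbose"])).is_err());
    println!("parse_args contract holds");
}
```

Each of the three `Err` paths here maps to exit code 2 in `main`, which is what Step 6.5's smoke test relies on.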
+
+- [ ] **Step 6.4: Build the binary**
+
+Run:
+
+```
+cargo build -p brit-verify
+```
+
+Expected: compiles. If API mismatches, follow the note above.
+
+- [ ] **Step 6.5: End-to-end smoke test (manual)**
+
+In the brit submodule workspace, create a scratch commit carrying pillar trailers:
+
+```
+git -c user.email=test@example.com -c user.name=test \
+  commit --allow-empty -m "$(cat <<'EOF'
+brit-verify smoke test
+
+Lamad: smoke-test message for brit-verify integration
+Shefa: zero-value scratch commit, no stewardship impact
+Qahal: self-reviewed, scaffolding only
+EOF
+)"
+
+SMOKE_SHA=$(git rev-parse HEAD)
+cargo run -p brit-verify -- $SMOKE_SHA
+```
+
+Expected output (approximately):
+
+```
+✓ pillar trailers valid for <sha>
+  Lamad: smoke-test message for brit-verify integration
+  Shefa: zero-value scratch commit, no stewardship impact
+  Qahal: self-reviewed, scaffolding only
+```
+
+Then verify negative case:
+
+```
+cargo run -p brit-verify -- HEAD~5
+```
+
+Expected: an upstream gitoxide commit fails with something like `✗ pillar validation failed for HEAD~5: required pillar trailer missing: Lamad` and exits non-zero.
+
+- [ ] **Step 6.6: Roll back the smoke-test commit**
+
+```
+git reset --soft HEAD~1
+git status --short
+```
+
+Verify only the brit-verify files are staged (`brit-verify/Cargo.toml`, `brit-verify/src/main.rs`, root `Cargo.toml`). Nothing else.
+
+- [ ] **Step 6.7: Commit the binary**
+
+```
+git add brit-verify/Cargo.toml brit-verify/src/main.rs Cargo.toml
+git commit -m "feat(brit-verify): first brit binary — pillar trailer verifier
+
+Opens a repo, resolves a rev, parses pillar trailers via brit-epr,
+runs structural validation, exits 0/1/2/3. No clap, no tracing —
+smallest possible end-to-end proof that the engine + elohim schema
+work against real git objects."
+``` + +--- + +## Task 7: Bump the submodule pointer in the parent monorepo + +**Files:** +- Modify: `/projects/elohim/` (parent monorepo — bumps the brit submodule SHA) + +- [ ] **Step 7.1: Switch to parent monorepo** + +``` +cd /projects/elohim +``` + +- [ ] **Step 7.2: Verify the submodule pointer advanced** + +``` +git status elohim/brit +``` + +Expected: `modified: elohim/brit (new commits)`. + +- [ ] **Step 7.3: Stage and commit the pointer bump** + +``` +git add elohim/brit +git commit -m "chore(brit): bump submodule to Phase 0+1 trailer foundation + +Advances the brit submodule pointer to the commit range that adds +the brit-epr crate (engine + elohim feature module) and the +brit-verify binary. See elohim/brit/docs/plans/2026-04-11-phase-0- +epr-trailer-foundation.md for the implementation plan and +elohim/brit/docs/schemas/elohim-protocol-manifest.md for the schema." +``` + +- [ ] **Step 7.4: Push-dry-run the parent monorepo** + +``` +git push --dry-run +``` + +Expected: pre-push runs, reports what would be pushed. Do NOT actually push — leave that for the user to confirm. + +- [ ] **Step 7.5: Report back** + +Report to the user: + +``` +Phase 0+1 complete. Summary: + + - brit-epr crate scaffolded with engine + elohim feature split. + - Engine: AppSchema trait, TrailerSet, parse_trailer_block via gix-object. + - Elohim: PillarTrailers, ElohimProtocolSchema, parse_pillar_trailers, + validate_pillar_trailers. + - 9 tests passing (engine_parsing: 2, elohim_parse: 3, elohim_validate: 4). + - --no-default-features build verified — engine compiles without elohim. + - brit-verify binary builds, smoke-tested end-to-end against a real commit. + - Submodule pointer bumped in parent monorepo. + +Ready to push both repos. Waiting for confirmation. 
+``` + +--- + +## Self-Review + +**Spec coverage:** +- ✅ Engine/app-schema split from schema doc §2.3 and §11.1 (Task 0, Task 1) +- ✅ `AppSchema` trait matching the pseudocode in §2.3 (Task 1) +- ✅ Engine parses trailer blocks without knowing the vocabulary (Task 2) +- ✅ Elohim feature module implements `AppSchema` with closed vocabulary (Task 3) +- ✅ `parse_pillar_trailers` + `validate_pillar_trailers` (Tasks 4, 5) +- ✅ `--no-default-features` compile check (Task 5 Step 5.5) +- ✅ End-to-end CLI binary with real git objects (Task 6) +- ✅ Submodule pointer bump (Task 7) +- ✅ Merge consent explicitly OUT of scope (header + scope section) +- ✅ Reach awareness explicitly OUT of scope (header + scope section) +- ✅ CID parsing explicitly OUT of scope — raw strings only (Task 3 Step 3.1, Task 4 Step 4.6) + +**Placeholder scan:** None. Every step has the actual code or command. API drift notes are explicit about what to grep for when names shift. + +**Type consistency:** `TrailerKey` is used identically across `pillar_trailers.rs`, `schema.rs`, `parse.rs`, `validate.rs`, and the tests. `PillarTrailers` fields (`lamad/shefa/qahal/*_node`) are used identically in parser and validator. `ValidationError` vs `PillarValidationError`: the engine has a broad `ValidationError`; the elohim module has a narrower `PillarValidationError` that reports errors in terms of `TrailerKey`. This is intentional — each layer speaks its own vocabulary. 
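The two-vocabulary error split described above can be made concrete with a few lines of plain Rust. This is an illustrative sketch only: the variants and the `From` conversion below are hypothetical stand-ins, not the actual `brit-epr` definitions; it just shows the pattern of a narrow schema-level error widening into a broad engine-level one at exactly one boundary.

```rust
// Hypothetical stand-ins for the plan's error layering: a narrow
// schema-level error and a broad engine-level error.
#[derive(Debug, PartialEq)]
enum PillarValidationError {
    // The schema layer reports in its own vocabulary (trailer keys).
    MissingTrailer(&'static str),
}

#[derive(Debug, PartialEq)]
enum ValidationError {
    // The engine only knows that "some schema rejected this".
    Schema(String),
}

// The boundary: schema errors widen into engine errors exactly once.
impl From<PillarValidationError> for ValidationError {
    fn from(e: PillarValidationError) -> Self {
        ValidationError::Schema(format!("{e:?}"))
    }
}

fn main() {
    let narrow = PillarValidationError::MissingTrailer("Lamad");
    let broad: ValidationError = narrow.into();
    assert!(matches!(broad, ValidationError::Schema(_)));
    println!("ok");
}
```

The schema layer keeps reporting in its own terms, while callers that only know the engine still get a single error type to match on.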
+ +**What this plan does NOT cover (deferred to later phases):** +- CID parsing / resolution / graph traversal → Phase 2 +- ContentNode adapter → Phase 2 +- `MergeProposalContentNode` + async merge consent → Phase 2 (co-resolves with §14.1 #12) +- Reach awareness on branches → Phase 2 +- libp2p transport → Phase 3 +- Full `brit-cli` with subcommands → Phase 3+ +- Signal emission → Phase 2+ +- JSON Schema codegen pipeline → Phase 2+ +- Per-branch READMEs → Phase 4 +- DHT announcement → Phase 5 +- Fork-as-governance → Phase 6 diff --git a/docs/plans/2026-04-16-phase-2a-build-attestation-primitives.md b/docs/plans/2026-04-16-phase-2a-build-attestation-primitives.md new file mode 100644 index 00000000000..4b6eff4c673 --- /dev/null +++ b/docs/plans/2026-04-16-phase-2a-build-attestation-primitives.md @@ -0,0 +1,2793 @@ +# Phase 2a: Build Attestation Primitives Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Add three attestation ContentNode types (build, deploy, validation) to `brit-epr`, a local object store under `.git/brit/objects/`, git ref management under `refs/notes/brit/`, and a `brit build-ref` CLI — all pure-local, no DHT, no P2P. + +**Architecture:** Extends `brit-epr`'s feature-gated `elohim` module with attestation schemas (serde-serializable structs), a minimal `ContentNode` trait for CID-addressed local storage, and a ref-to-CID index backed by git notes refs. A new `brit-build-ref` binary crate provides the CLI. Agent signing uses `ed25519-dalek` with a file-based key at `.git/brit/agent-key`. Everything behind the existing `elohim-protocol` feature flag. + +**Tech Stack:** Rust 2021, serde + serde_json (serialization), blake3 (content hashing), ed25519-dalek (signing), clap 4 (CLI), gix (repo + ref access). 
All already in the workspace lockfile except blake3 and ed25519-dalek.
+
+**Design spec:** `docs/plans/phases/phase-2a-build-attestation-primitives.md` — when this plan and the spec disagree, the spec wins.
+
+**Dependency note:** The Phase 2 ContentNode adapter (RepoContentNode, CommitContentNode, etc.) has not been designed yet. This plan introduces the *minimal* ContentNode foundation needed for attestations — a trait, a CID type, and a local store. The full Phase 2 adapter will extend this foundation when it lands.
+
+---
+
+## File structure
+
+```
+brit-epr/
+├── Cargo.toml                    # modified: add serde, serde_json, blake3, ed25519-dalek
+├── src/
+│   ├── lib.rs                    # modified: re-export new types
+│   ├── engine/
+│   │   ├── mod.rs                # modified: export new modules
+│   │   ├── content_node.rs       # NEW: ContentNode trait
+│   │   ├── cid.rs                # NEW: BritCid type (blake3-based)
+│   │   ├── object_store.rs       # NEW: .git/brit/objects/ local store
+│   │   └── signing.rs            # NEW: ed25519 agent signing
+│   └── elohim/
+│       ├── mod.rs                # modified: export attestation modules
+│       ├── attestation/
+│       │   ├── mod.rs            # NEW: module root
+│       │   ├── build.rs          # NEW: BuildAttestationContentNode
+│       │   ├── deploy.rs         # NEW: DeployAttestationContentNode
+│       │   ├── validation.rs     # NEW: ValidationAttestationContentNode
+│       │   └── reach.rs          # NEW: reach computation from attestations
+│       └── refs.rs               # NEW: refs/notes/brit/ management
+├── tests/
+│   ├── attestation_roundtrip.rs  # NEW: serde roundtrip for all three types
+│   ├── object_store.rs           # NEW: local store put/get/list
+│   ├── ref_management.rs         # NEW: ref read/write/list
+│   └── reach_computation.rs      # NEW: deterministic reach derivation
+brit-build-ref/
+├── Cargo.toml                    # NEW: binary crate
+└── src/
+    ├── main.rs                   # NEW: clap entrypoint
+    ├── build_cmd.rs              # NEW: build put/get/list
+    ├── deploy_cmd.rs             # NEW: deploy put/get/list
+    ├── validate_cmd.rs           # NEW: validate put/get/list
+    └── reach_cmd.rs              # NEW: reach compute/get
+```
+
+**Responsibilities per file:**
+
+- 
`engine/content_node.rs` — `ContentNode` trait: `content_type() -> &str`, serialization to canonical JSON, CID derivation. Engine-level, no pillar knowledge.
+- `engine/cid.rs` — `BritCid` wrapper around a blake3 hash. `Display` as hex. `FromStr` for parsing. `compute(bytes) -> BritCid`.
+- `engine/object_store.rs` — `LocalObjectStore` reads/writes JSON files to `.git/brit/objects/{cid}`. Pure filesystem, no git objects.
+- `engine/signing.rs` — `AgentKey` loads/generates ed25519 keypair from `.git/brit/agent-key`. `sign(payload) -> Signature`. `verify(payload, signature, pubkey) -> bool`.
+- `elohim/attestation/build.rs` — `BuildAttestationContentNode` struct with all fields from the Phase 2a spec.
+- `elohim/attestation/deploy.rs` — `DeployAttestationContentNode` struct.
+- `elohim/attestation/validation.rs` — `ValidationAttestationContentNode` struct with check vocabulary enforcement.
+- `elohim/attestation/reach.rs` — `compute_reach(step, store, refs) -> ReachLevel` derives reach from existing attestations.
+- `elohim/refs.rs` — `BritRefManager` wraps gix ref operations for `refs/notes/brit/{build,deploy,validate,reach}/*`.
+
+---
+
+## Task 0: Add dependencies to brit-epr
+
+**Files:**
+- Modify: `brit-epr/Cargo.toml`
+
+- [ ] **Step 0.1: Add serde, serde_json, blake3, ed25519-dalek, chrono**
+
+Edit `brit-epr/Cargo.toml`, add to `[dependencies]`:
+
+```toml
+[dependencies]
+gix-object = { version = "^0.58.0", path = "../gix-object", features = ["sha1"] }
+thiserror = "2.0"
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+blake3 = "1"
+ed25519-dalek = { version = "2", features = ["rand_core", "pkcs8"] }
+rand = "0.8"
+chrono = { version = "0.4", features = ["serde"], default-features = false }
+```
+
+> **Note:** `chrono` is for ISO-8601 timestamps. `rand` is for key generation. `ed25519-dalek` 2.x accepts any `rand_core`-compatible RNG for key generation; the `rand_core` feature enables `SigningKey::generate`.
Check the actual latest compatible versions in crates.io if cargo complains. + +- [ ] **Step 0.2: Verify it compiles** + +Run: + +``` +cargo build -p brit-epr +``` + +Expected: compiles. New deps are unused — that's fine, warnings expected. + +- [ ] **Step 0.3: Verify engine-only build still works** + +Run: + +``` +cargo build -p brit-epr --no-default-features +``` + +Expected: compiles. The engine deps (serde, blake3, etc.) are unconditional — they're needed for the ContentNode trait and CID type which live in the engine. + +- [ ] **Step 0.4: Commit** + +``` +git add brit-epr/Cargo.toml Cargo.lock +git commit -m "chore(brit-epr): add serde, blake3, ed25519-dalek, chrono deps + +Preparation for Phase 2a attestation primitives. These deps are +unconditional (engine-level) because ContentNode trait, CID, signing, +and timestamps are engine concerns, not schema-specific." +``` + +--- + +## Task 1: Engine — BritCid type (blake3-based content addressing) + +**Files:** +- Create: `brit-epr/src/engine/cid.rs` +- Modify: `brit-epr/src/engine/mod.rs` + +- [ ] **Step 1.1: Create `engine/cid.rs`** + +Create `brit-epr/src/engine/cid.rs`: + +```rust +//! `BritCid` — content identifier based on BLAKE3 hashing. +//! +//! Phase 2a uses a simplified CID: the BLAKE3 hash of the canonical JSON +//! serialization of a ContentNode. Full multiformats CIDv1 comes in a later +//! phase when interop with IPFS/Holochain requires it. + +use std::fmt; +use std::str::FromStr; + +use serde::{Deserialize, Serialize}; + +/// A content identifier — the BLAKE3 hash of a content payload. +/// +/// Displayed and parsed as a 64-character lowercase hex string. +#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)] +#[serde(transparent)] +pub struct BritCid(String); + +impl BritCid { + /// Compute a CID from arbitrary bytes. + pub fn compute(data: &[u8]) -> Self { + let hash = blake3::hash(data); + Self(hash.to_hex().to_string()) + } + + /// Return the hex string representation. 
+    pub fn as_str(&self) -> &str {
+        &self.0
+    }
+}
+
+impl fmt::Display for BritCid {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        f.write_str(&self.0)
+    }
+}
+
+impl FromStr for BritCid {
+    type Err = CidParseError;
+
+    fn from_str(s: &str) -> Result<Self, Self::Err> {
+        if s.len() != 64 {
+            return Err(CidParseError::InvalidLength(s.len()));
+        }
+        if !s.chars().all(|c| c.is_ascii_hexdigit()) {
+            return Err(CidParseError::InvalidHex);
+        }
+        Ok(Self(s.to_lowercase()))
+    }
+}
+
+/// Errors when parsing a CID string.
+#[derive(Debug, thiserror::Error, PartialEq, Eq)]
+pub enum CidParseError {
+    /// Expected 64 hex characters.
+    #[error("expected 64 hex characters, got {0}")]
+    InvalidLength(usize),
+    /// Non-hex character found.
+    #[error("CID contains non-hex characters")]
+    InvalidHex,
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn compute_is_deterministic() {
+        let a = BritCid::compute(b"hello world");
+        let b = BritCid::compute(b"hello world");
+        assert_eq!(a, b);
+    }
+
+    #[test]
+    fn different_input_different_cid() {
+        let a = BritCid::compute(b"hello");
+        let b = BritCid::compute(b"world");
+        assert_ne!(a, b);
+    }
+
+    #[test]
+    fn roundtrip_display_parse() {
+        let cid = BritCid::compute(b"test data");
+        let parsed: BritCid = cid.to_string().parse().unwrap();
+        assert_eq!(cid, parsed);
+    }
+
+    #[test]
+    fn rejects_short_string() {
+        let result = "abc123".parse::<BritCid>();
+        assert_eq!(result, Err(CidParseError::InvalidLength(6)));
+    }
+
+    #[test]
+    fn serde_roundtrip() {
+        let cid = BritCid::compute(b"serde test");
+        let json = serde_json::to_string(&cid).unwrap();
+        let back: BritCid = serde_json::from_str(&json).unwrap();
+        assert_eq!(cid, back);
+    }
+}
+```
+
+- [ ] **Step 1.2: Export from engine/mod.rs**
+
+Edit `brit-epr/src/engine/mod.rs` — add `cid` module and re-export:
+
+```rust
+//! Covenant engine — unconditional layer that knows the trailer format and
+//! dispatch contract but not any specific schema vocabulary.
+ +mod app_schema; +pub mod cid; +mod error; +mod trailer_block; +mod trailer_set; + +pub use app_schema::AppSchema; +pub use cid::{BritCid, CidParseError}; +pub use error::{EngineError, ValidationError}; +pub use trailer_block::parse_trailer_block; +pub use trailer_set::TrailerSet; +``` + +- [ ] **Step 1.3: Add re-export to lib.rs** + +Edit `brit-epr/src/lib.rs` — add to the unconditional re-exports: + +```rust +// Unconditional re-exports +pub use engine::{AppSchema, BritCid, CidParseError, TrailerSet, ValidationError}; +``` + +- [ ] **Step 1.4: Run tests** + +Run: + +``` +cargo test -p brit-epr -- cid +``` + +Expected: 5 unit tests pass. + +- [ ] **Step 1.5: Commit** + +``` +git add brit-epr/src/engine/cid.rs brit-epr/src/engine/mod.rs brit-epr/src/lib.rs +git commit -m "feat(brit-epr/engine): add BritCid type with blake3 hashing + +Content identifiers are BLAKE3 hashes displayed as 64-char hex. +Deterministic, serde-serializable, FromStr-parseable. Full multiformats +CIDv1 deferred to the phase that needs IPFS/Holochain interop." +``` + +--- + +## Task 2: Engine — ContentNode trait and LocalObjectStore + +**Files:** +- Create: `brit-epr/src/engine/content_node.rs` +- Create: `brit-epr/src/engine/object_store.rs` +- Modify: `brit-epr/src/engine/mod.rs` +- Create: `brit-epr/tests/object_store.rs` + +- [ ] **Step 2.1: Write the failing test for object store** + +Create `brit-epr/tests/object_store.rs`: + +```rust +//! Integration tests for LocalObjectStore. 
+
+use brit_epr::engine::cid::BritCid;
+use brit_epr::engine::content_node::ContentNode;
+use brit_epr::engine::object_store::LocalObjectStore;
+use serde::{Deserialize, Serialize};
+use tempfile::TempDir;
+
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
+struct TestNode {
+    name: String,
+    value: u32,
+}
+
+impl ContentNode for TestNode {
+    fn content_type(&self) -> &'static str {
+        "test.node"
+    }
+}
+
+#[test]
+fn put_then_get_roundtrips() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+
+    let node = TestNode {
+        name: "hello".into(),
+        value: 42,
+    };
+
+    let cid = store.put(&node).unwrap();
+    let back: TestNode = store.get(&cid).unwrap();
+
+    assert_eq!(node, back);
+}
+
+#[test]
+fn same_content_same_cid() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+
+    let node = TestNode {
+        name: "deterministic".into(),
+        value: 7,
+    };
+
+    let cid1 = store.put(&node).unwrap();
+    let cid2 = store.put(&node).unwrap();
+    assert_eq!(cid1, cid2);
+}
+
+#[test]
+fn get_missing_cid_returns_error() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+    let fake_cid = BritCid::compute(b"does not exist");
+
+    let result = store.get::<TestNode>(&fake_cid);
+    assert!(result.is_err());
+}
+
+#[test]
+fn list_returns_all_stored_cids() {
+    let tmp = TempDir::new().unwrap();
+    let store = LocalObjectStore::new(tmp.path().join("objects"));
+
+    let a = store
+        .put(&TestNode {
+            name: "a".into(),
+            value: 1,
+        })
+        .unwrap();
+    let b = store
+        .put(&TestNode {
+            name: "b".into(),
+            value: 2,
+        })
+        .unwrap();
+
+    let mut cids = store.list().unwrap();
+    cids.sort_by(|x, y| x.as_str().cmp(y.as_str()));
+
+    let mut expected = vec![a, b];
+    expected.sort_by(|x, y| x.as_str().cmp(y.as_str()));
+
+    assert_eq!(cids, expected);
+}
+```
+
+- [ ] **Step 2.2: Run tests — expect compile failure**
+
+Run:
+
+```
+cargo test -p 
brit-epr --test object_store
+```
+
+Expected: compile errors — `content_node` and `object_store` modules don't exist yet.
+
+- [ ] **Step 2.3: Create `engine/content_node.rs`**
+
+Create `brit-epr/src/engine/content_node.rs`:
+
+```rust
+//! `ContentNode` — trait for CID-addressed content objects stored locally.
+//!
+//! This is the minimal foundation Phase 2a needs. The full Phase 2
+//! ContentNode adapter (RepoContentNode, CommitContentNode, etc.) will
+//! extend this trait with pillar fields and relationship methods.
+
+use serde::{de::DeserializeOwned, Serialize};
+
+use crate::engine::cid::BritCid;
+
+/// A content-addressed node that can be serialized to canonical JSON and
+/// stored in the local object store.
+///
+/// Implementors must be `Serialize + DeserializeOwned`. The CID is
+/// computed from the canonical JSON representation (keys sorted,
+/// no trailing whitespace).
+pub trait ContentNode: Serialize + DeserializeOwned {
+    /// The content type discriminator, e.g. `"brit.build-attestation"`.
+    fn content_type(&self) -> &'static str;
+
+    /// Serialize to canonical JSON bytes.
+    ///
+    /// Default implementation uses serde_json with sorted keys.
+    fn canonical_json(&self) -> Result<Vec<u8>, serde_json::Error> {
+        // serde_json serializes struct fields in declaration order, which
+        // is deterministic for a given struct definition. For canonical
+        // ordering across potential future schema evolution, we round-trip
+        // through serde_json::Value to get sorted keys.
+        let value = serde_json::to_value(self)?;
+        serde_json::to_vec(&value)
+    }
+
+    /// Compute the content identifier from the canonical JSON representation.
+    fn compute_cid(&self) -> Result<BritCid, serde_json::Error> {
+        let bytes = self.canonical_json()?;
+        Ok(BritCid::compute(&bytes))
+    }
+}
+```
+
+- [ ] **Step 2.4: Create `engine/object_store.rs`**
+
+Create `brit-epr/src/engine/object_store.rs`:
+
+```rust
+//! `LocalObjectStore` — stores ContentNodes as JSON files under
`.git/brit/objects/`, addressed by their BritCid.
+
+use std::fs;
+use std::path::PathBuf;
+
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+
+/// Filesystem-backed content-addressed store.
+///
+/// Objects live at `{base_dir}/{cid}` as canonical JSON. The store
+/// creates the directory on first write if it doesn't exist.
+pub struct LocalObjectStore {
+    base_dir: PathBuf,
+}
+
+impl LocalObjectStore {
+    /// Create a store rooted at the given directory.
+    pub fn new(base_dir: PathBuf) -> Self {
+        Self { base_dir }
+    }
+
+    /// Create a store for a git repo by locating `.git/brit/objects/`.
+    pub fn for_git_dir(git_dir: &std::path::Path) -> Self {
+        Self::new(git_dir.join("brit").join("objects"))
+    }
+
+    /// Store a ContentNode. Returns its CID.
+    ///
+    /// Idempotent: storing the same content twice produces the same CID
+    /// and overwrites with identical bytes.
+    pub fn put<T: ContentNode>(&self, node: &T) -> Result<BritCid, ObjectStoreError> {
+        let json = node.canonical_json().map_err(ObjectStoreError::Serialize)?;
+        let cid = BritCid::compute(&json);
+
+        fs::create_dir_all(&self.base_dir).map_err(ObjectStoreError::Io)?;
+
+        let path = self.base_dir.join(cid.as_str());
+        fs::write(&path, &json).map_err(ObjectStoreError::Io)?;
+
+        Ok(cid)
+    }
+
+    /// Retrieve a ContentNode by CID.
+    pub fn get<T: ContentNode>(&self, cid: &BritCid) -> Result<T, ObjectStoreError> {
+        let path = self.base_dir.join(cid.as_str());
+        let bytes = fs::read(&path).map_err(|e| {
+            if e.kind() == std::io::ErrorKind::NotFound {
+                ObjectStoreError::NotFound(cid.clone())
+            } else {
+                ObjectStoreError::Io(e)
+            }
+        })?;
+        serde_json::from_slice(&bytes).map_err(ObjectStoreError::Deserialize)
+    }
+
+    /// List all stored CIDs.
+    pub fn list(&self) -> Result<Vec<BritCid>, ObjectStoreError> {
+        if !self.base_dir.exists() {
+            return Ok(Vec::new());
+        }
+        let mut cids = Vec::new();
+        for entry in fs::read_dir(&self.base_dir).map_err(ObjectStoreError::Io)?
{
+            let entry = entry.map_err(ObjectStoreError::Io)?;
+            if let Some(name) = entry.file_name().to_str() {
+                if let Ok(cid) = name.parse::<BritCid>() {
+                    cids.push(cid);
+                }
+            }
+        }
+        Ok(cids)
+    }
+}
+
+/// Errors from the local object store.
+#[derive(Debug, thiserror::Error)]
+pub enum ObjectStoreError {
+    /// Filesystem error.
+    #[error("I/O error: {0}")]
+    Io(#[from] std::io::Error),
+    /// Serialization failed.
+    #[error("serialization error: {0}")]
+    Serialize(serde_json::Error),
+    /// Deserialization failed.
+    #[error("deserialization error: {0}")]
+    Deserialize(serde_json::Error),
+    /// Object not found.
+    #[error("object not found: {0}")]
+    NotFound(BritCid),
+}
+```
+
+- [ ] **Step 2.5: Export from engine/mod.rs**
+
+Edit `brit-epr/src/engine/mod.rs`:
+
+```rust
+//! Covenant engine — unconditional layer that knows the trailer format and
+//! dispatch contract but not any specific schema vocabulary.
+
+mod app_schema;
+pub mod cid;
+pub mod content_node;
+mod error;
+pub mod object_store;
+mod trailer_block;
+mod trailer_set;
+
+pub use app_schema::AppSchema;
+pub use cid::{BritCid, CidParseError};
+pub use content_node::ContentNode;
+pub use error::{EngineError, ValidationError};
+pub use object_store::{LocalObjectStore, ObjectStoreError};
+pub use trailer_block::parse_trailer_block;
+pub use trailer_set::TrailerSet;
+```
+
+- [ ] **Step 2.6: Add tempfile dev-dependency**
+
+Edit `brit-epr/Cargo.toml`, add:
+
+```toml
+[dev-dependencies]
+tempfile = "3"
+```
+
+- [ ] **Step 2.7: Update lib.rs re-exports**
+
+Edit `brit-epr/src/lib.rs` — update unconditional re-exports:
+
+```rust
+// Unconditional re-exports
+pub use engine::{
+    AppSchema, BritCid, CidParseError, ContentNode, LocalObjectStore, ObjectStoreError, TrailerSet,
+    ValidationError,
+};
+```
+
+- [ ] **Step 2.8: Run tests**
+
+Run:
+
+```
+cargo test -p brit-epr --test object_store
+```
+
+Expected: 4 tests pass.
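The contract these tests pin down (deterministic address, idempotent put, different content at a different address) can be sketched with the standard library alone. This is an illustrative sketch, not the plan's code: `cid_of` uses std's `DefaultHasher` as a stand-in for blake3, whose output is not stable across processes or Rust versions — which is exactly why the real store uses blake3.

```rust
use std::collections::hash_map::DefaultHasher;
use std::env;
use std::fs;
use std::hash::{Hash, Hasher};
use std::path::Path;

// Stand-in CID derivation (hypothetical): std's DefaultHasher instead of
// blake3. Deterministic within one process, which is enough for the demo.
fn cid_of(bytes: &[u8]) -> String {
    let mut h = DefaultHasher::new();
    bytes.hash(&mut h);
    format!("{:016x}", h.finish())
}

// The put contract: write content at a path derived from its own hash.
fn put(dir: &Path, bytes: &[u8]) -> std::io::Result<String> {
    fs::create_dir_all(dir)?;
    let cid = cid_of(bytes);
    fs::write(dir.join(&cid), bytes)?;
    Ok(cid)
}

fn main() -> std::io::Result<()> {
    let dir = env::temp_dir().join("brit-objects-demo");
    let a = put(&dir, br#"{"name":"hello","value":42}"#)?;
    let b = put(&dir, br#"{"name":"hello","value":42}"#)?;
    assert_eq!(a, b); // idempotent: same bytes, same address, same file
    let c = put(&dir, br#"{"name":"other","value":7}"#)?;
    assert_ne!(a, c); // different content lives at a different address
    println!("ok");
    Ok(())
}
```

Swapping `cid_of` for `BritCid::compute` over canonical JSON gives the real store's behavior; the rest of the put path has the same shape.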
+
+- [ ] **Step 2.9: Run all existing tests too**
+
+Run:
+
+```
+cargo test -p brit-epr
+```
+
+Expected: all previous tests (9 from Phase 0+1) plus the 5 CID unit tests plus 4 object store tests = 18 total pass.
+
+- [ ] **Step 2.10: Commit**
+
+```
+git add brit-epr/src/engine/content_node.rs brit-epr/src/engine/object_store.rs brit-epr/src/engine/mod.rs brit-epr/src/lib.rs brit-epr/Cargo.toml brit-epr/tests/object_store.rs
+git commit -m "feat(brit-epr/engine): add ContentNode trait and LocalObjectStore
+
+ContentNode trait provides canonical JSON serialization and CID
+derivation. LocalObjectStore stores nodes as CID-addressed JSON files
+under .git/brit/objects/. Minimal foundation for Phase 2a attestation
+types — the full Phase 2 adapter will extend this."
+```
+
+---
+
+## Task 3: Engine — agent signing (ed25519)
+
+**Files:**
+- Create: `brit-epr/src/engine/signing.rs`
+- Modify: `brit-epr/src/engine/mod.rs`
+
+- [ ] **Step 3.1: Create `engine/signing.rs`**
+
+Create `brit-epr/src/engine/signing.rs`:
+
+```rust
+//! Agent signing — ed25519 keypair management for attestation signatures.
+//!
+//! Phase 2a uses a file-based key at `.git/brit/agent-key` (a raw 32-byte
+//! ed25519 seed). Full agent key management (Holochain integration, key
+//! derivation from device seed) comes in a later phase.
+
+use std::fs;
+use std::path::{Path, PathBuf};
+
+use ed25519_dalek::{Signer, SigningKey, VerifyingKey};
+
+/// An agent's signing identity, loaded from or generated to a file.
+pub struct AgentKey {
+    signing_key: SigningKey,
+    key_path: PathBuf,
+}
+
+impl AgentKey {
+    /// Load an existing key or generate a new one at the given path.
+    pub fn load_or_generate(key_path: &Path) -> Result<Self, AgentKeyError> {
+        if key_path.exists() {
+            Self::load(key_path)
+        } else {
+            Self::generate(key_path)
+        }
+    }
+
+    /// Load from an existing 32-byte seed file.
+    pub fn load(key_path: &Path) -> Result<Self, AgentKeyError> {
+        let bytes = fs::read(key_path).map_err(AgentKeyError::Io)?;
+        if bytes.len() != 32 {
+            return Err(AgentKeyError::InvalidKeyLength(bytes.len()));
+        }
+        let seed: [u8; 32] = bytes
+            .try_into()
+            .map_err(|_| AgentKeyError::InvalidKeyLength(0))?;
+        let signing_key = SigningKey::from_bytes(&seed);
+        Ok(Self {
+            signing_key,
+            key_path: key_path.to_path_buf(),
+        })
+    }
+
+    /// Generate a new keypair and write the 32-byte seed to disk.
+    pub fn generate(key_path: &Path) -> Result<Self, AgentKeyError> {
+        let mut rng = rand::thread_rng();
+        let signing_key = SigningKey::generate(&mut rng);
+
+        if let Some(parent) = key_path.parent() {
+            fs::create_dir_all(parent).map_err(AgentKeyError::Io)?;
+        }
+        fs::write(key_path, signing_key.to_bytes()).map_err(AgentKeyError::Io)?;
+
+        Ok(Self {
+            signing_key,
+            key_path: key_path.to_path_buf(),
+        })
+    }
+
+    /// Sign arbitrary bytes. Returns the 64-byte ed25519 signature as hex.
+    pub fn sign(&self, payload: &[u8]) -> String {
+        let sig = self.signing_key.sign(payload);
+        hex::encode(sig.to_bytes())
+    }
+
+    /// The agent's public key as a 64-character hex string.
+    pub fn agent_id(&self) -> String {
+        hex::encode(self.signing_key.verifying_key().to_bytes())
+    }
+
+    /// The verifying (public) key.
+    pub fn verifying_key(&self) -> VerifyingKey {
+        self.signing_key.verifying_key()
+    }
+
+    /// Path where the key is stored.
+    pub fn key_path(&self) -> &Path {
+        &self.key_path
+    }
+}
+
+/// Verify a hex-encoded signature against a hex-encoded public key.
+pub fn verify_signature(
+    payload: &[u8],
+    signature_hex: &str,
+    pubkey_hex: &str,
+) -> Result<bool, AgentKeyError> {
+    let sig_bytes =
+        hex::decode(signature_hex).map_err(|_| AgentKeyError::InvalidSignatureHex)?;
+    let sig = ed25519_dalek::Signature::from_slice(&sig_bytes)
+        .map_err(|_| AgentKeyError::InvalidSignatureHex)?;
+
+    let pub_bytes = hex::decode(pubkey_hex).map_err(|_| AgentKeyError::InvalidPubkeyHex)?;
+    let pubkey = VerifyingKey::from_bytes(
+        &pub_bytes
+            .try_into()
+            .map_err(|_| AgentKeyError::InvalidPubkeyHex)?,
+    )
+    .map_err(|_| AgentKeyError::InvalidPubkeyHex)?;
+
+    Ok(pubkey.verify_strict(payload, &sig).is_ok())
+}
+
+/// Agent key errors.
+#[derive(Debug, thiserror::Error)]
+pub enum AgentKeyError {
+    /// Filesystem error.
+    #[error("I/O error: {0}")]
+    Io(std::io::Error),
+    /// Key file has wrong length.
+    #[error("expected 32-byte key seed, got {0} bytes")]
+    InvalidKeyLength(usize),
+    /// Signature hex is invalid.
+    #[error("invalid signature hex")]
+    InvalidSignatureHex,
+    /// Public key hex is invalid.
+ #[error("invalid public key hex")] + InvalidPubkeyHex, +} + +#[cfg(test)] +mod tests { + use super::*; + use tempfile::TempDir; + + #[test] + fn generate_load_roundtrip() { + let tmp = TempDir::new().unwrap(); + let path = tmp.path().join("brit").join("agent-key"); + + let key1 = AgentKey::generate(&path).unwrap(); + let key2 = AgentKey::load(&path).unwrap(); + + assert_eq!(key1.agent_id(), key2.agent_id()); + } + + #[test] + fn sign_and_verify() { + let tmp = TempDir::new().unwrap(); + let key = AgentKey::generate(&tmp.path().join("key")).unwrap(); + + let payload = b"attestation payload"; + let sig = key.sign(payload); + + assert!(verify_signature(payload, &sig, &key.agent_id()).unwrap()); + } + + #[test] + fn wrong_payload_fails_verify() { + let tmp = TempDir::new().unwrap(); + let key = AgentKey::generate(&tmp.path().join("key")).unwrap(); + + let sig = key.sign(b"original"); + + assert!(!verify_signature(b"tampered", &sig, &key.agent_id()).unwrap()); + } + + #[test] + fn load_or_generate_creates_if_missing() { + let tmp = TempDir::new().unwrap(); + let path = tmp.path().join("agent-key"); + + assert!(!path.exists()); + let key = AgentKey::load_or_generate(&path).unwrap(); + assert!(path.exists()); + assert_eq!(key.agent_id().len(), 64); // 32 bytes as hex + } +} +``` + +- [ ] **Step 3.2: Add hex dependency** + +Edit `brit-epr/Cargo.toml`, add to `[dependencies]`: + +```toml +hex = "0.4" +``` + +- [ ] **Step 3.3: Export from engine/mod.rs** + +Edit `brit-epr/src/engine/mod.rs` — add: + +```rust +pub mod signing; +``` + +And add to the pub use block: + +```rust +pub use signing::{verify_signature, AgentKey, AgentKeyError}; +``` + +- [ ] **Step 3.4: Run tests** + +Run: + +``` +cargo test -p brit-epr -- signing +``` + +Expected: 4 tests pass. 
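The length arithmetic the signing tests rely on — a 32-byte public key becomes a 64-character `agent_id`, a 64-byte ed25519 signature becomes 128 hex characters — is just hex encoding. The helper below is a std-only stand-in for `hex::encode`, shown only to make the numbers concrete; the plan itself uses the `hex` crate.

```rust
// Std-only stand-in for hex::encode (illustrative): every byte becomes
// two lowercase hex characters, so lengths always double.
fn hex_encode(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{b:02x}")).collect()
}

fn main() {
    // A 32-byte ed25519 public key -> the 64-char agent_id the tests assert.
    assert_eq!(hex_encode(&[0u8; 32]).len(), 64);
    // A 64-byte ed25519 signature -> 128 hex chars.
    assert_eq!(hex_encode(&[0xab_u8; 64]).len(), 128);
    assert_eq!(hex_encode(&[255]), "ff");
    println!("ok");
}
```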
+ +- [ ] **Step 3.5: Commit** + +``` +git add brit-epr/src/engine/signing.rs brit-epr/src/engine/mod.rs brit-epr/Cargo.toml Cargo.lock +git commit -m "feat(brit-epr/engine): add ed25519 agent signing + +File-based ed25519 key at .git/brit/agent-key. Sign/verify with hex- +encoded signatures and public keys. Full agent key management (Holochain +integration) deferred to later phase." +``` + +--- + +## Task 4: Elohim — attestation schemas (all three types) + +**Files:** +- Create: `brit-epr/src/elohim/attestation/mod.rs` +- Create: `brit-epr/src/elohim/attestation/build.rs` +- Create: `brit-epr/src/elohim/attestation/deploy.rs` +- Create: `brit-epr/src/elohim/attestation/validation.rs` +- Modify: `brit-epr/src/elohim/mod.rs` +- Create: `brit-epr/tests/attestation_roundtrip.rs` + +- [ ] **Step 4.1: Write the failing test** + +Create `brit-epr/tests/attestation_roundtrip.rs`: + +```rust +//! Serde roundtrip tests for all three attestation ContentNode types. + +use brit_epr::engine::cid::BritCid; +use brit_epr::engine::content_node::ContentNode; +use brit_epr::elohim::attestation::build::BuildAttestationContentNode; +use brit_epr::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus}; +use brit_epr::elohim::attestation::validation::{ + ValidationAttestationContentNode, ValidationResult, +}; + +fn sample_cid() -> BritCid { + BritCid::compute(b"sample artifact") +} + +#[test] +fn build_attestation_roundtrips() { + let node = BuildAttestationContentNode { + manifest_cid: sample_cid(), + step_name: "elohim-edge:cargo-build-storage".into(), + inputs_hash: "abc123def456".into(), + output_cid: BritCid::compute(b"output artifact"), + agent_id: "deadbeef".repeat(4), + hardware_profile: serde_json::json!({ + "arch": "x86_64", + "os": "linux", + "memory_gb": 32 + }), + build_duration_ms: 45_000, + built_at: "2026-04-16T10:00:00Z".into(), + success: true, + signature: "sig_placeholder".into(), + }; + + let json = serde_json::to_string_pretty(&node).unwrap(); + 
let back: BuildAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.build-attestation"); + + // CID is deterministic + let cid1 = node.compute_cid().unwrap(); + let cid2 = back.compute_cid().unwrap(); + assert_eq!(cid1, cid2); +} + +#[test] +fn deploy_attestation_roundtrips() { + let node = DeployAttestationContentNode { + artifact_cid: sample_cid(), + step_name: "elohim-edge:cargo-build-storage".into(), + environment_label: "staging".into(), + endpoint: "https://staging.elohim.host".into(), + health_check_url: "https://staging.elohim.host/health".into(), + health_status: HealthStatus::Healthy, + deployed_at: "2026-04-16T10:05:00Z".into(), + attested_at: "2026-04-16T10:05:30Z".into(), + liveness_ttl_sec: 300, + agent_id: "deadbeef".repeat(4), + signature: "sig_placeholder".into(), + }; + + let json = serde_json::to_string_pretty(&node).unwrap(); + let back: DeployAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.deploy-attestation"); +} + +#[test] +fn validation_attestation_roundtrips() { + let node = ValidationAttestationContentNode { + artifact_cid: sample_cid(), + check_name: "sonarqube-scan@v10".into(), + validator_id: "sonarqube-agent-001".into(), + validator_version: "10.7.0".into(), + result: ValidationResult::Pass, + result_summary: "0 bugs, 0 vulnerabilities, 2 code smells".into(), + findings_cid: None, + validated_at: "2026-04-16T10:10:00Z".into(), + ttl_sec: Some(86_400), + signature: "sig_placeholder".into(), + }; + + let json = serde_json::to_string_pretty(&node).unwrap(); + let back: ValidationAttestationContentNode = serde_json::from_str(&json).unwrap(); + assert_eq!(node, back); + assert_eq!(node.content_type(), "brit.validation-attestation"); +} + +#[test] +fn validation_result_serializes_as_lowercase() { + let pass = serde_json::to_string(&ValidationResult::Pass).unwrap(); + assert_eq!(pass, 
r#""pass""#);
+
+    let fail = serde_json::to_string(&ValidationResult::Fail).unwrap();
+    assert_eq!(fail, r#""fail""#);
+
+    let warn = serde_json::to_string(&ValidationResult::Warn).unwrap();
+    assert_eq!(warn, r#""warn""#);
+
+    let skip = serde_json::to_string(&ValidationResult::Skip).unwrap();
+    assert_eq!(skip, r#""skip""#);
+}
+
+#[test]
+fn health_status_serializes_as_lowercase() {
+    let h = serde_json::to_string(&HealthStatus::Healthy).unwrap();
+    assert_eq!(h, r#""healthy""#);
+
+    let d = serde_json::to_string(&HealthStatus::Degraded).unwrap();
+    assert_eq!(d, r#""degraded""#);
+
+    let u = serde_json::to_string(&HealthStatus::Unreachable).unwrap();
+    assert_eq!(u, r#""unreachable""#);
+}
+```
+
+- [ ] **Step 4.2: Run — expect compile failure**
+
+Run:
+
+```
+cargo test -p brit-epr --test attestation_roundtrip
+```
+
+Expected: compile errors — attestation modules don't exist.
+
+- [ ] **Step 4.3: Create `elohim/attestation/mod.rs`**
+
+Create `brit-epr/src/elohim/attestation/mod.rs`:
+
+```rust
+//! Attestation ContentNode types for the Elohim Protocol.
+//!
+//! Three types: build (artifact was produced), deploy (artifact is live),
+//! validation (artifact passed a named check). See Phase 2a spec for
+//! field-by-field documentation.
+
+pub mod build;
+pub mod deploy;
+pub mod validation;
+```
+
+- [ ] **Step 4.4: Create `elohim/attestation/build.rs`**
+
+Create `brit-epr/src/elohim/attestation/build.rs`:
+
+```rust
+//! `BuildAttestationContentNode` — records that an agent produced an
+//! output artifact from a manifest's inputs.
+
+use serde::{Deserialize, Serialize};
+
+use crate::engine::cid::BritCid;
+use crate::engine::content_node::ContentNode;
+
+/// Records that an agent produced an output artifact from a manifest's inputs.
+/// +/// Pillar coupling: +/// - Lamad: `build-knowledge` — what was built, from what, how +/// - Shefa: `compute-expended` — economic cost of producing the artifact +/// - Qahal: `build-authority` — agent's right to attest this artifact +#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct BuildAttestationContentNode { + /// CID of the BuildManifestContentNode this attestation is for. + pub manifest_cid: BritCid, + /// Qualified step name (e.g., `elohim-edge:cargo-build-storage`). + pub step_name: String, + /// Content hash of all declared inputs at build time. + pub inputs_hash: String, + /// Content-addressed output artifact. + pub output_cid: BritCid, + /// Hex-encoded public key of the peer that performed the build. + pub agent_id: String, + /// CPU arch, OS, memory, relevant toolchain versions. + pub hardware_profile: serde_json::Value, + /// Wall-clock build time in milliseconds. + pub build_duration_ms: u64, + /// ISO-8601 timestamp when the build completed. + pub built_at: String, + /// Did the build succeed. + pub success: bool, + /// Hex-encoded ed25519 signature over the canonical JSON payload. + pub signature: String, +} + +impl ContentNode for BuildAttestationContentNode { + fn content_type(&self) -> &'static str { + "brit.build-attestation" + } +} +``` + +- [ ] **Step 4.5: Create `elohim/attestation/deploy.rs`** + +Create `brit-epr/src/elohim/attestation/deploy.rs`: + +```rust +//! `DeployAttestationContentNode` — records that an agent confirms an +//! artifact is live at an environment. + +use serde::{Deserialize, Serialize}; + +use crate::engine::cid::BritCid; +use crate::engine::content_node::ContentNode; + +/// Health status of a deployed artifact. +#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)] +#[serde(rename_all = "lowercase")] +pub enum HealthStatus { + /// Service is healthy. + Healthy, + /// Service is degraded but responding. 
+ Degraded, + /// Service is unreachable. + Unreachable, +} + +/// Records that an agent confirms an artifact is live at an environment. +/// +/// Pillar coupling: +/// - Lamad: `deployment-knowledge` — what is running where +/// - Shefa: `serving-compute` — cost of hosting/serving +/// - Qahal: `environment-authority` — agent's right to attest this environment +#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct DeployAttestationContentNode { + /// CID of the output artifact being attested. + pub artifact_cid: BritCid, + /// Which step's artifact this is. + pub step_name: String, + /// `alpha`, `staging`, `prod`, `self`, or custom. + pub environment_label: String, + /// URL or service address being verified. + pub endpoint: String, + /// Endpoint used to verify liveness. + pub health_check_url: String, + /// Current health status. + pub health_status: HealthStatus, + /// ISO-8601 when the artifact started serving here. + pub deployed_at: String, + /// ISO-8601 when this attestation was produced. + pub attested_at: String, + /// After this many seconds without re-attestation, the claim self-invalidates. + pub liveness_ttl_sec: u64, + /// Hex-encoded public key of the peer producing the attestation. + pub agent_id: String, + /// Hex-encoded ed25519 signature. + pub signature: String, +} + +impl ContentNode for DeployAttestationContentNode { + fn content_type(&self) -> &'static str { + "brit.deploy-attestation" + } +} +``` + +- [ ] **Step 4.6: Create `elohim/attestation/validation.rs`** + +Create `brit-epr/src/elohim/attestation/validation.rs`: + +```rust +//! `ValidationAttestationContentNode` — records that a validator applied +//! a named check to an artifact. + +use serde::{Deserialize, Serialize}; + +use crate::engine::cid::BritCid; +use crate::engine::content_node::ContentNode; + +/// Result of a validation check. 
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum ValidationResult {
+    /// Check passed.
+    Pass,
+    /// Check failed.
+    Fail,
+    /// Check produced warnings but did not fail.
+    Warn,
+    /// Check was skipped (e.g., not applicable to this artifact).
+    Skip,
+}
+
+/// Records that a validator (tool or agent) applied a named check to an artifact.
+///
+/// Check vocabulary is governed by the AppManifest. A check is only recognized
+/// if its `check_name` is registered in the current manifest version.
+///
+/// Pillar coupling:
+/// - Lamad: `validation-knowledge` — the findings
+/// - Shefa: `verification-compute` — cost of running the check
+/// - Qahal: `validation-authority` — community's recognition that this check counts
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ValidationAttestationContentNode {
+    /// CID of what was validated.
+    pub artifact_cid: BritCid,
+    /// Registered check identifier (e.g., `sonarqube-scan@v10`, `trivy-cve@latest`).
+    pub check_name: String,
+    /// Tool identity or agent pubkey.
+    pub validator_id: String,
+    /// Version of the tool/agent.
+    pub validator_version: String,
+    /// Check result.
+    pub result: ValidationResult,
+    /// Human-readable summary.
+    pub result_summary: String,
+    /// Optional detailed report (CID of findings document).
+    pub findings_cid: Option<BritCid>,
+    /// ISO-8601 when validation was performed.
+    pub validated_at: String,
+    /// When validation goes stale (e.g., CVE DB refresh interval). None = never stale.
+    pub ttl_sec: Option<u64>,
+    /// Hex-encoded ed25519 signature.
+    pub signature: String,
+}
+
+impl ContentNode for ValidationAttestationContentNode {
+    fn content_type(&self) -> &'static str {
+        "brit.validation-attestation"
+    }
+}
+```
+
+- [ ] **Step 4.7: Wire up elohim/mod.rs**
+
+Edit `brit-epr/src/elohim/mod.rs` — add `attestation` module:
+
+```rust
+//! 
Elohim Protocol app schema — first-party `AppSchema` implementation. +//! +//! Gated behind `#[cfg(feature = "elohim-protocol")]`. With this feature +//! disabled, `brit-epr` ships only the engine. + +pub mod attestation; +mod parse; +mod pillar_trailers; +mod schema; +mod validate; + +pub use parse::parse_pillar_trailers; +pub use pillar_trailers::{PillarTrailers, TrailerKey}; +pub use schema::ElohimProtocolSchema; +pub use validate::{validate_pillar_trailers, PillarValidationError}; +``` + +- [ ] **Step 4.8: Run tests** + +Run: + +``` +cargo test -p brit-epr --test attestation_roundtrip +``` + +Expected: 5 tests pass. + +- [ ] **Step 4.9: Commit** + +``` +git add brit-epr/src/elohim/attestation/ brit-epr/src/elohim/mod.rs brit-epr/tests/attestation_roundtrip.rs +git commit -m "feat(brit-epr/elohim): add three attestation ContentNode schemas + +BuildAttestationContentNode, DeployAttestationContentNode, and +ValidationAttestationContentNode — serde-serializable, CID-derivable, +camelCase JSON. All three round-trip through serde with deterministic +CIDs. Field sets match Phase 2a spec exactly." +``` + +--- + +## Task 5: Elohim — git ref namespace management + +**Files:** +- Create: `brit-epr/src/elohim/refs.rs` +- Modify: `brit-epr/src/elohim/mod.rs` +- Create: `brit-epr/tests/ref_management.rs` + +- [ ] **Step 5.1: Write the failing test** + +Create `brit-epr/tests/ref_management.rs`: + +```rust +//! Tests for refs/notes/brit/ namespace management. +//! +//! These tests use a temp git repo to exercise real ref operations. 
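+//! Assumes a `git` binary on PATH and `tempfile` available as a
+//! dev-dependency; add `tempfile` to `[dev-dependencies]` if it is missing.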
+
+use brit_epr::elohim::refs::BritRefManager;
+use std::process::Command;
+use tempfile::TempDir;
+
+fn init_git_repo() -> TempDir {
+    let tmp = TempDir::new().unwrap();
+    Command::new("git")
+        .args(["init", "--initial-branch=main"])
+        .current_dir(tmp.path())
+        .output()
+        .unwrap();
+    // Configure identity in the repo itself: later `git notes add` calls
+    // create commits on the notes ref and fail without user.name/email.
+    Command::new("git")
+        .args(["config", "user.email", "test@test.com"])
+        .current_dir(tmp.path())
+        .output()
+        .unwrap();
+    Command::new("git")
+        .args(["config", "user.name", "test"])
+        .current_dir(tmp.path())
+        .output()
+        .unwrap();
+    Command::new("git")
+        .args(["commit", "--allow-empty", "-m", "init"])
+        .current_dir(tmp.path())
+        .output()
+        .unwrap();
+    tmp
+}
+
+#[test]
+fn put_and_get_build_ref() {
+    let tmp = init_git_repo();
+    let mgr = BritRefManager::new(tmp.path()).unwrap();
+
+    let payload = serde_json::json!({
+        "attestationCid": "abc123",
+        "outputCid": "def456",
+        "agentId": "agent001",
+        "builtAt": "2026-04-16T10:00:00Z"
+    });
+
+    mgr.put_build_ref("elohim-edge:storage", "HEAD", &payload)
+        .unwrap();
+
+    let got = mgr.get_build_ref("elohim-edge:storage", "HEAD").unwrap();
+    assert_eq!(got, Some(payload));
+}
+
+#[test]
+fn get_missing_ref_returns_none() {
+    let tmp = init_git_repo();
+    let mgr = BritRefManager::new(tmp.path()).unwrap();
+
+    let got = mgr.get_build_ref("nonexistent", "HEAD").unwrap();
+    assert_eq!(got, None);
+}
+
+#[test]
+fn put_and_get_deploy_ref() {
+    let tmp = init_git_repo();
+    let mgr = BritRefManager::new(tmp.path()).unwrap();
+
+    let payload = serde_json::json!({
+        "artifactCid": "abc123",
+        "healthStatus": "healthy"
+    });
+
+    mgr.put_deploy_ref("elohim-edge:storage", "staging", &payload)
+        .unwrap();
+
+    let got = mgr
+        .get_deploy_ref("elohim-edge:storage", "staging")
+        .unwrap();
+    assert_eq!(got, Some(payload));
+}
+
+#[test]
+fn put_and_get_validate_ref() {
+    let tmp = init_git_repo();
+    let mgr = BritRefManager::new(tmp.path()).unwrap();
+
+    let payload = serde_json::json!({
+        "artifactCid": "abc123",
+        "result": "pass"
+    });
+
+    mgr.put_validate_ref("elohim-edge:storage", "sonarqube-scan@v10", &payload)
+        .unwrap();
+
+    let got = mgr
+        .get_validate_ref("elohim-edge:storage", "sonarqube-scan@v10")
+        
.unwrap();
+    assert_eq!(got, Some(payload));
+}
+
+#[test]
+fn list_build_refs() {
+    let tmp = init_git_repo();
+    let mgr = BritRefManager::new(tmp.path()).unwrap();
+
+    mgr.put_build_ref("step-a", "HEAD", &serde_json::json!({"a": 1}))
+        .unwrap();
+    mgr.put_build_ref("step-b", "HEAD", &serde_json::json!({"b": 2}))
+        .unwrap();
+
+    let mut refs = mgr.list_build_refs(None).unwrap();
+    refs.sort();
+
+    assert_eq!(refs, vec!["step-a", "step-b"]);
+}
+```
+
+- [ ] **Step 5.2: Run — expect compile failure**
+
+Run:
+
+```
+cargo test -p brit-epr --test ref_management
+```
+
+Expected: compile error — `refs` module doesn't exist.
+
+- [ ] **Step 5.3: Create `elohim/refs.rs`**
+
+Create `brit-epr/src/elohim/refs.rs`:
+
+```rust
+//! `BritRefManager` — read/write/list git refs under `refs/notes/brit/`.
+//!
+//! Uses git CLI commands for ref operations. A future iteration may use gix's
+//! ref store API directly, but git CLI is simpler and more reliable for notes
+//! refs (which gix doesn't fully support as of 0.81).
+//!
+//! Ref layout:
+//! - `refs/notes/brit/build/{step_name}` — build attestation per commit
+//! - `refs/notes/brit/deploy/{step_name}/{env}` — deploy attestation
+//! - `refs/notes/brit/validate/{step_name}/{check_name}` — validation attestation
+//! - `refs/notes/brit/reach/{step_name}` — derived reach level
+
+use std::path::{Path, PathBuf};
+use std::process::Command;
+
+/// Manages git refs under `refs/notes/brit/` for attestation indexing.
+pub struct BritRefManager {
+    repo_path: PathBuf,
+}
+
+impl BritRefManager {
+    /// Create a manager for the given repo path.
+    pub fn new(repo_path: &Path) -> Result<Self, RefError> {
+        if !repo_path.join(".git").exists() && !repo_path.join("HEAD").exists() {
+            return Err(RefError::NotARepo(repo_path.display().to_string()));
+        }
+        Ok(Self {
+            repo_path: repo_path.to_path_buf(),
+        })
+    }
+
+    // --- Build refs ---
+
+    /// Write a build attestation ref for a step at a commit.
+    pub fn put_build_ref(
+        &self,
+        step_name: &str,
+        commit_rev: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/build/{step_name}");
+        let commit_sha = self.resolve_rev(commit_rev)?;
+        self.write_note(&ref_name, &commit_sha, payload)
+    }
+
+    /// Read the build attestation for a step at a commit.
+    pub fn get_build_ref(
+        &self,
+        step_name: &str,
+        commit_rev: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/build/{step_name}");
+        let commit_sha = self.resolve_rev(commit_rev)?;
+        self.read_note(&ref_name, &commit_sha)
+    }
+
+    /// List all build ref step names, optionally filtered by pattern.
+    pub fn list_build_refs(
+        &self,
+        pattern: Option<&str>,
+    ) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/build/", pattern)
+    }
+
+    // --- Deploy refs ---
+
+    /// Write a deploy attestation ref for a step + environment.
+    pub fn put_deploy_ref(
+        &self,
+        step_name: &str,
+        env: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/deploy/{step_name}/{env}");
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the deploy attestation for a step + environment.
+    pub fn get_deploy_ref(
+        &self,
+        step_name: &str,
+        env: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/deploy/{step_name}/{env}");
+        self.read_ref_blob(&ref_name)
+    }
+
+    /// List all deploy ref step names.
+    pub fn list_deploy_refs(
+        &self,
+        pattern: Option<&str>,
+    ) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/deploy/", pattern)
+    }
+
+    // --- Validate refs ---
+
+    /// Write a validation attestation ref for a step + check.
+    pub fn put_validate_ref(
+        &self,
+        step_name: &str,
+        check_name: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/validate/{step_name}/{check_name}");
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the validation attestation for a step + check.
+    pub fn get_validate_ref(
+        &self,
+        step_name: &str,
+        check_name: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/validate/{step_name}/{check_name}");
+        self.read_ref_blob(&ref_name)
+    }
+
+    /// List all validate ref step names.
+    pub fn list_validate_refs(
+        &self,
+        pattern: Option<&str>,
+    ) -> Result<Vec<String>, RefError> {
+        self.list_refs_under("refs/notes/brit/validate/", pattern)
+    }
+
+    // --- Reach refs ---
+
+    /// Write a reach ref for a step.
+    pub fn put_reach_ref(
+        &self,
+        step_name: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let ref_name = format!("refs/notes/brit/reach/{step_name}");
+        self.write_ref_blob(&ref_name, payload)
+    }
+
+    /// Read the reach ref for a step.
+    pub fn get_reach_ref(
+        &self,
+        step_name: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let ref_name = format!("refs/notes/brit/reach/{step_name}");
+        self.read_ref_blob(&ref_name)
+    }
+
+    // --- Internal helpers ---
+
+    fn resolve_rev(&self, rev: &str) -> Result<String, RefError> {
+        let output = Command::new("git")
+            .args(["rev-parse", rev])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !output.status.success() {
+            return Err(RefError::RevNotFound(rev.to_string()));
+        }
+        Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
+    }
+
+    /// Write a git note against a specific commit.
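+    /// Shells out to `git notes --ref <name> add -f -m <json> <commit>`.
+    /// A fully qualified `refs/notes/...` name is passed through verbatim;
+    /// `git notes --ref` only prepends `refs/notes/` to short names.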
+    fn write_note(
+        &self,
+        ref_name: &str,
+        commit_sha: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let json = serde_json::to_string(payload).map_err(RefError::Json)?;
+        let output = Command::new("git")
+            .args(["notes", "--ref", ref_name, "add", "-f", "-m", &json, commit_sha])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !output.status.success() {
+            let stderr = String::from_utf8_lossy(&output.stderr);
+            return Err(RefError::GitFailed(format!(
+                "notes add to {ref_name}: {stderr}"
+            )));
+        }
+        Ok(())
+    }
+
+    /// Read a git note for a specific commit.
+    fn read_note(
+        &self,
+        ref_name: &str,
+        commit_sha: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let output = Command::new("git")
+            .args(["notes", "--ref", ref_name, "show", commit_sha])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !output.status.success() {
+            return Ok(None);
+        }
+        let text = String::from_utf8_lossy(&output.stdout);
+        let value = serde_json::from_str(text.trim()).map_err(RefError::Json)?;
+        Ok(Some(value))
+    }
+
+    /// Write a JSON blob to a standalone ref (for deploy/validate/reach refs
+    /// which are not per-commit).
+    fn write_ref_blob(
+        &self,
+        ref_name: &str,
+        payload: &serde_json::Value,
+    ) -> Result<(), RefError> {
+        let json = serde_json::to_string(payload).map_err(RefError::Json)?;
+
+        // Write the JSON as a blob object
+        let hash_output = Command::new("git")
+            .args(["hash-object", "-w", "--stdin"])
+            .current_dir(&self.repo_path)
+            .stdin(std::process::Stdio::piped())
+            .stdout(std::process::Stdio::piped())
+            .spawn()
+            .and_then(|mut child| {
+                use std::io::Write;
+                child.stdin.take().unwrap().write_all(json.as_bytes())?;
+                child.wait_with_output()
+            })
+            .map_err(RefError::GitCommand)?;
+
+        if !hash_output.status.success() {
+            return Err(RefError::GitFailed("hash-object failed".into()));
+        }
+        let blob_sha = String::from_utf8_lossy(&hash_output.stdout)
+            .trim()
+            .to_string();
+
+        // Point the ref at the blob
+        let update_output = Command::new("git")
+            .args(["update-ref", ref_name, &blob_sha])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !update_output.status.success() {
+            let stderr = String::from_utf8_lossy(&update_output.stderr);
+            return Err(RefError::GitFailed(format!(
+                "update-ref {ref_name}: {stderr}"
+            )));
+        }
+        Ok(())
+    }
+
+    /// Read a JSON blob from a standalone ref.
+    fn read_ref_blob(
+        &self,
+        ref_name: &str,
+    ) -> Result<Option<serde_json::Value>, RefError> {
+        let output = Command::new("git")
+            .args(["cat-file", "-p", ref_name])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !output.status.success() {
+            return Ok(None);
+        }
+        let text = String::from_utf8_lossy(&output.stdout);
+        let value = serde_json::from_str(text.trim()).map_err(RefError::Json)?;
+        Ok(Some(value))
+    }
+
+    /// List ref names under a prefix, extracting the suffix as the step name.
+    fn list_refs_under(
+        &self,
+        prefix: &str,
+        pattern: Option<&str>,
+    ) -> Result<Vec<String>, RefError> {
+        let output = Command::new("git")
+            .args(["for-each-ref", "--format=%(refname)", prefix])
+            .current_dir(&self.repo_path)
+            .output()
+            .map_err(RefError::GitCommand)?;
+
+        if !output.status.success() {
+            return Ok(Vec::new());
+        }
+
+        let text = String::from_utf8_lossy(&output.stdout);
+        let names: Vec<String> = text
+            .lines()
+            .filter(|l| !l.is_empty())
+            .filter_map(|line| line.strip_prefix(prefix))
+            // Honor the advertised pattern filter with a simple substring
+            // match; a glob engine can replace this later if needed.
+            .filter(|name| pattern.map_or(true, |p| name.contains(p)))
+            .map(|s| s.to_string())
+            .collect();
+        Ok(names)
+    }
+}
+
+/// Ref management errors.
+#[derive(Debug, thiserror::Error)]
+pub enum RefError {
+    /// Not a git repository.
+    #[error("not a git repository: {0}")]
+    NotARepo(String),
+    /// Git rev not found.
+    #[error("rev not found: {0}")]
+    RevNotFound(String),
+    /// Git command failed to execute.
+    #[error("git command error: {0}")]
+    GitCommand(std::io::Error),
+    /// Git command returned non-zero.
+    #[error("git command failed: {0}")]
+    GitFailed(String),
+    /// JSON serialization/deserialization error.
+    #[error("JSON error: {0}")]
+    Json(serde_json::Error),
+}
+```
+
+- [ ] **Step 5.4: Wire up elohim/mod.rs**
+
+Edit `brit-epr/src/elohim/mod.rs` — add `pub mod refs;`:
+
+```rust
+pub mod attestation;
+mod parse;
+mod pillar_trailers;
+pub mod refs;
+mod schema;
+mod validate;
+```
+
+- [ ] **Step 5.5: Run tests**
+
+Run:
+
+```
+cargo test -p brit-epr --test ref_management
+```
+
+Expected: 5 tests pass.
+
+- [ ] **Step 5.6: Commit**
+
+```
+git add brit-epr/src/elohim/refs.rs brit-epr/src/elohim/mod.rs brit-epr/tests/ref_management.rs
+git commit -m "feat(brit-epr/elohim): add BritRefManager for refs/notes/brit/ namespace
+
+Read/write/list git refs for build, deploy, validate, and reach
+attestations. Build refs use git notes (per-commit). Deploy/validate/
+reach refs use standalone blob refs. All under refs/notes/brit/ so
+they can be shared via an explicit fetch refspec."
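+
+# Note (assumption about the sharing setup): refs under refs/notes/ are not
+# covered by the default fetch refspec, so peers that need these refs must
+# fetch them explicitly, e.g.:
+#   git fetch origin '+refs/notes/brit/*:refs/notes/brit/*'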
+``` + +--- + +## Task 6: Elohim — reach computation + +**Files:** +- Create: `brit-epr/src/elohim/attestation/reach.rs` +- Create: `brit-epr/tests/reach_computation.rs` + +- [ ] **Step 6.1: Write the failing test** + +Create `brit-epr/tests/reach_computation.rs`: + +```rust +//! Tests for deterministic reach computation from attestations. + +use brit_epr::elohim::attestation::reach::{compute_reach, ReachInput, ReachLevel}; + +#[test] +fn no_attestations_returns_unknown() { + let input = ReachInput { + build_attestations: vec![], + deploy_attestations: vec![], + validation_attestations: vec![], + }; + assert_eq!(compute_reach(&input), ReachLevel::Unknown); +} + +#[test] +fn build_only_returns_built() { + let input = ReachInput { + build_attestations: vec!["agent-a".into()], + deploy_attestations: vec![], + validation_attestations: vec![], + }; + assert_eq!(compute_reach(&input), ReachLevel::Built); +} + +#[test] +fn build_plus_deploy_returns_deployed() { + let input = ReachInput { + build_attestations: vec!["agent-a".into()], + deploy_attestations: vec!["staging".into()], + validation_attestations: vec![], + }; + assert_eq!(compute_reach(&input), ReachLevel::Deployed); +} + +#[test] +fn build_plus_deploy_plus_validation_returns_verified() { + let input = ReachInput { + build_attestations: vec!["agent-a".into()], + deploy_attestations: vec!["staging".into()], + validation_attestations: vec!["sonarqube-scan@v10".into()], + }; + assert_eq!(compute_reach(&input), ReachLevel::Verified); +} + +#[test] +fn same_inputs_same_result() { + let input = ReachInput { + build_attestations: vec!["agent-a".into(), "agent-b".into()], + deploy_attestations: vec!["staging".into()], + validation_attestations: vec!["trivy@latest".into(), "sonarqube@v10".into()], + }; + let r1 = compute_reach(&input); + let r2 = compute_reach(&input); + assert_eq!(r1, r2, "reach computation must be deterministic"); +} +``` + +- [ ] **Step 6.2: Run — expect compile failure** + +Run: + +``` +cargo test -p 
brit-epr --test reach_computation
+```
+
+Expected: compile error — `reach` module doesn't exist.
+
+- [ ] **Step 6.3: Create `elohim/attestation/reach.rs`**
+
+Create `brit-epr/src/elohim/attestation/reach.rs`:
+
+```rust
+//! Reach computation — derives a reach level from existing attestations.
+//!
+//! Deterministic: same attestations → same reach level. This is the Phase 2a
+//! local-only computation. The full reach-promotion-rule DSL (how AppManifest
+//! declares "build + 3 diverse peers + sonarqube pass = community reach") is
+//! deferred to a later phase.
+
+use serde::{Deserialize, Serialize};
+
+/// Reach level derived from attestations for a given step.
+///
+/// Levels are ordered: Unknown < Built < Deployed < Verified.
+/// Phase 2a uses this simple ladder. The full reach taxonomy from the
+/// protocol schema (personal, trusted, community, public) maps onto
+/// this in a later phase when the AppManifest reach-promotion rules
+/// are defined.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Serialize, Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum ReachLevel {
+    /// No attestations exist.
+    Unknown,
+    /// At least one build attestation exists.
+    Built,
+    /// Built + at least one deploy attestation.
+    Deployed,
+    /// Built + deployed + at least one validation attestation passes.
+    Verified,
+}
+
+/// Input for reach computation — collected from existing attestation refs.
+#[derive(Debug, Clone)]
+pub struct ReachInput {
+    /// Agent IDs that have attested successful builds.
+    pub build_attestations: Vec<String>,
+    /// Environment labels with active deploy attestations.
+    pub deploy_attestations: Vec<String>,
+    /// Check names with passing validation attestations.
+    pub validation_attestations: Vec<String>,
+}
+
+/// Compute the reach level from collected attestation data.
+///
+/// Deterministic: same inputs → same output. Order of attestations
+/// within each category does not matter.
+pub fn compute_reach(input: &ReachInput) -> ReachLevel { + let has_build = !input.build_attestations.is_empty(); + let has_deploy = !input.deploy_attestations.is_empty(); + let has_validation = !input.validation_attestations.is_empty(); + + match (has_build, has_deploy, has_validation) { + (true, true, true) => ReachLevel::Verified, + (true, true, false) => ReachLevel::Deployed, + (true, false, _) => ReachLevel::Built, + (false, _, _) => ReachLevel::Unknown, + } +} +``` + +- [ ] **Step 6.4: Wire up attestation/mod.rs** + +Edit `brit-epr/src/elohim/attestation/mod.rs`: + +```rust +//! Attestation ContentNode types for the Elohim Protocol. +//! +//! Three types: build (artifact was produced), deploy (artifact is live), +//! validation (artifact passed a named check). See Phase 2a spec for +//! field-by-field documentation. + +pub mod build; +pub mod deploy; +pub mod reach; +pub mod validation; +``` + +- [ ] **Step 6.5: Run tests** + +Run: + +``` +cargo test -p brit-epr --test reach_computation +``` + +Expected: 5 tests pass. + +- [ ] **Step 6.6: Commit** + +``` +git add brit-epr/src/elohim/attestation/reach.rs brit-epr/src/elohim/attestation/mod.rs brit-epr/tests/reach_computation.rs +git commit -m "feat(brit-epr/elohim): add deterministic reach computation + +ReachLevel ladder: Unknown → Built → Deployed → Verified. Derived from +attestation presence. Deterministic: same inputs = same output. Full +reach-promotion-rule DSL deferred to AppManifest work." 
+```
+
+---
+
+## Task 7: Build the `brit-build-ref` CLI
+
+**Files:**
+- Create: `brit-build-ref/Cargo.toml`
+- Create: `brit-build-ref/src/main.rs`
+- Create: `brit-build-ref/src/build_cmd.rs`
+- Create: `brit-build-ref/src/deploy_cmd.rs`
+- Create: `brit-build-ref/src/validate_cmd.rs`
+- Create: `brit-build-ref/src/reach_cmd.rs`
+- Modify: `Cargo.toml` (root — add workspace member)
+
+- [ ] **Step 7.1: Create the binary manifest**
+
+Create `brit-build-ref/Cargo.toml`:
+
+```toml
+lints.workspace = true
+
+[package]
+name = "brit-build-ref"
+version = "0.0.0"
+description = "Manage build, deploy, and validation attestation refs in a brit repo"
+repository = "https://github.com/ethosengine/brit"
+authors = ["Matthew Dowell "]
+license = "MIT OR Apache-2.0"
+edition = "2021"
+rust-version = "1.82"
+
+[[bin]]
+name = "brit-build-ref"
+path = "src/main.rs"
+
+[dependencies]
+brit-epr = { version = "^0.0.0", path = "../brit-epr" }
+chrono = "0.4"
+clap = { version = "4", features = ["derive"] }
+serde_json = "1"
+```
+
+- [ ] **Step 7.2: Add to workspace members**
+
+Edit root `Cargo.toml`. Find the `members = [` list and add `"brit-build-ref"` after `"brit-verify"`:
+
+```toml
+    "brit-epr",
+    "brit-verify",
+    "brit-build-ref",
+]
+```
+
+- [ ] **Step 7.3: Create the clap entrypoint**
+
+Create `brit-build-ref/src/main.rs`:
+
+```rust
+//! `brit build-ref` — manage attestation refs in a brit repo.
+//!
+//! Usage: `brit-build-ref <subcommand> [options]`
+
+use std::process::ExitCode;
+
+use clap::{Parser, Subcommand};
+
+mod build_cmd;
+mod deploy_cmd;
+mod reach_cmd;
+mod validate_cmd;
+
+#[derive(Parser)]
+#[command(name = "brit-build-ref")]
+#[command(about = "Manage build, deploy, and validation attestation refs")]
+struct Cli {
+    /// Path to the git repository (defaults to current directory).
+    #[arg(long, default_value = ".")]
+    repo: String,
+
+    #[command(subcommand)]
+    command: Commands,
+}
+
+#[derive(Subcommand)]
+enum Commands {
+    /// Build attestation management.
+ Build { + #[command(subcommand)] + action: build_cmd::BuildAction, + }, + /// Deploy attestation management. + Deploy { + #[command(subcommand)] + action: deploy_cmd::DeployAction, + }, + /// Validation attestation management. + Validate { + #[command(subcommand)] + action: validate_cmd::ValidateAction, + }, + /// Reach level management. + Reach { + #[command(subcommand)] + action: reach_cmd::ReachAction, + }, +} + +fn main() -> ExitCode { + let cli = Cli::parse(); + let repo_path = std::path::Path::new(&cli.repo); + + let result = match cli.command { + Commands::Build { action } => build_cmd::run(repo_path, action), + Commands::Deploy { action } => deploy_cmd::run(repo_path, action), + Commands::Validate { action } => validate_cmd::run(repo_path, action), + Commands::Reach { action } => reach_cmd::run(repo_path, action), + }; + + match result { + Ok(()) => ExitCode::SUCCESS, + Err(e) => { + eprintln!("error: {e}"); + ExitCode::FAILURE + } + } +} +``` + +- [ ] **Step 7.4: Create `build_cmd.rs`** + +Create `brit-build-ref/src/build_cmd.rs`: + +```rust +//! `brit build-ref build` subcommands. + +use std::path::Path; + +use brit_epr::elohim::attestation::build::BuildAttestationContentNode; +use brit_epr::engine::cid::BritCid; +use brit_epr::engine::content_node::ContentNode; +use brit_epr::engine::object_store::LocalObjectStore; +use brit_epr::engine::signing::AgentKey; +use brit_epr::elohim::refs::BritRefManager; +use clap::Subcommand; + +#[derive(Subcommand)] +pub enum BuildAction { + /// Record a build attestation. + Put { + /// Qualified step name. + #[arg(long)] + step: String, + /// CID of the build manifest. + #[arg(long)] + manifest: String, + /// CID of the output artifact. + #[arg(long)] + output: String, + /// Build succeeded. + #[arg(long, default_value_t = true)] + success: bool, + /// Hardware profile as JSON string. + #[arg(long, default_value = "{}")] + hardware: String, + /// Build duration in milliseconds. 
+        #[arg(long, default_value_t = 0)]
+        duration_ms: u64,
+        /// Commit to associate (defaults to HEAD).
+        #[arg(long, default_value = "HEAD")]
+        commit: String,
+    },
+    /// Read a build attestation.
+    Get {
+        /// Qualified step name.
+        #[arg(long)]
+        step: String,
+        /// Commit to read from (defaults to HEAD).
+        #[arg(long, default_value = "HEAD")]
+        commit: String,
+    },
+    /// List build attestation steps.
+    List {
+        /// Filter pattern.
+        #[arg(long)]
+        step: Option<String>,
+    },
+}
+
+pub fn run(repo_path: &Path, action: BuildAction) -> Result<(), Box<dyn std::error::Error>> {
+    match action {
+        BuildAction::Put {
+            step,
+            manifest,
+            output,
+            success,
+            hardware,
+            duration_ms,
+            commit,
+        } => {
+            let git_dir = repo_path.join(".git");
+            let store = LocalObjectStore::for_git_dir(&git_dir);
+            let key = AgentKey::load_or_generate(&git_dir.join("brit").join("agent-key"))?;
+            let refs = BritRefManager::new(repo_path)?;
+
+            let manifest_cid: BritCid = manifest.parse()?;
+            let output_cid: BritCid = output.parse()?;
+            let hardware_profile: serde_json::Value = serde_json::from_str(&hardware)?;
+
+            let now = chrono::Utc::now().to_rfc3339();
+
+            let mut node = BuildAttestationContentNode {
+                manifest_cid,
+                step_name: step.clone(),
+                inputs_hash: String::new(), // TODO: compute from manifest inputs
+                output_cid: output_cid.clone(),
+                agent_id: key.agent_id(),
+                hardware_profile,
+                build_duration_ms: duration_ms,
+                built_at: now,
+                success,
+                signature: String::new(), // filled after signing
+            };
+
+            // Sign the node (with empty signature field, then fill it)
+            let canonical = node.canonical_json()?;
+            node.signature = key.sign(&canonical);
+
+            let cid = store.put(&node)?;
+
+            // Write the ref
+            let ref_payload = serde_json::json!({
+                "attestationCid": cid.as_str(),
+                "outputCid": output_cid.as_str(),
+                "agentId": node.agent_id,
+                "builtAt": node.built_at,
+            });
+            refs.put_build_ref(&step, &commit, &ref_payload)?;
+
+            println!("{cid}");
+            Ok(())
+        }
+        BuildAction::Get { step, commit } => {
+            let refs = 
BritRefManager::new(repo_path)?;
+            match refs.get_build_ref(&step, &commit)? {
+                Some(payload) => {
+                    println!("{}", serde_json::to_string_pretty(&payload)?);
+                    Ok(())
+                }
+                None => {
+                    eprintln!("no build attestation for step={step} at {commit}");
+                    Ok(())
+                }
+            }
+        }
+        BuildAction::List { step } => {
+            let refs = BritRefManager::new(repo_path)?;
+            let steps = refs.list_build_refs(step.as_deref())?;
+            for s in steps {
+                println!("{s}");
+            }
+            Ok(())
+        }
+    }
+}
+```
+
+- [ ] **Step 7.5: Create `deploy_cmd.rs`**
+
+Create `brit-build-ref/src/deploy_cmd.rs`:
+
+```rust
+//! `brit build-ref deploy` subcommands.
+
+use std::path::Path;
+
+use brit_epr::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus};
+use brit_epr::engine::cid::BritCid;
+use brit_epr::engine::content_node::ContentNode;
+use brit_epr::engine::object_store::LocalObjectStore;
+use brit_epr::engine::signing::AgentKey;
+use brit_epr::elohim::refs::BritRefManager;
+use clap::Subcommand;
+
+#[derive(Subcommand)]
+pub enum DeployAction {
+    /// Record a deploy attestation.
+    Put {
+        /// Qualified step name.
+        #[arg(long)]
+        step: String,
+        /// Environment label (alpha, staging, prod, self, or custom).
+        #[arg(long)]
+        env: String,
+        /// CID of the artifact being attested.
+        #[arg(long)]
+        artifact: String,
+        /// URL or service address.
+        #[arg(long)]
+        endpoint: String,
+        /// Health status: healthy, degraded, unreachable.
+        #[arg(long, default_value = "healthy")]
+        health: String,
+        /// Liveness TTL in seconds.
+        #[arg(long, default_value_t = 300)]
+        ttl: u64,
+    },
+    /// Read a deploy attestation.
+    Get {
+        /// Qualified step name.
+        #[arg(long)]
+        step: String,
+        /// Environment label.
+        #[arg(long)]
+        env: String,
+    },
+    /// List deploy attestation steps.
+    List {
+        /// Filter by step pattern.
+        #[arg(long)]
+        step: Option<String>,
+        /// Filter by environment.
+        #[arg(long)]
+        env: Option<String>,
+    },
+}
+
+pub fn run(repo_path: &Path, action: DeployAction) -> Result<(), Box<dyn std::error::Error>> {
+    match action {
+        DeployAction::Put {
+            step,
+            env,
+            artifact,
+            endpoint,
+            health,
+            ttl,
+        } => {
+            let git_dir = repo_path.join(".git");
+            let store = LocalObjectStore::for_git_dir(&git_dir);
+            let key = AgentKey::load_or_generate(&git_dir.join("brit").join("agent-key"))?;
+            let refs = BritRefManager::new(repo_path)?;
+
+            let artifact_cid: BritCid = artifact.parse()?;
+            let health_status: HealthStatus = match health.as_str() {
+                "healthy" => HealthStatus::Healthy,
+                "degraded" => HealthStatus::Degraded,
+                "unreachable" => HealthStatus::Unreachable,
+                other => return Err(format!("unknown health status: {other}").into()),
+            };
+
+            let now = chrono::Utc::now().to_rfc3339();
+
+            let mut node = DeployAttestationContentNode {
+                artifact_cid: artifact_cid.clone(),
+                step_name: step.clone(),
+                environment_label: env.clone(),
+                endpoint: endpoint.clone(),
+                health_check_url: format!("{endpoint}/health"),
+                health_status,
+                deployed_at: now.clone(),
+                attested_at: now,
+                liveness_ttl_sec: ttl,
+                agent_id: key.agent_id(),
+                signature: String::new(),
+            };
+
+            let canonical = node.canonical_json()?;
+            node.signature = key.sign(&canonical);
+
+            let cid = store.put(&node)?;
+
+            let ref_payload = serde_json::json!({
+                "artifactCid": artifact_cid.as_str(),
+                "attestationCid": cid.as_str(),
+                "healthStatus": health,
+                "attestedAt": node.attested_at,
+                "livenessTtlSec": ttl,
+            });
+            refs.put_deploy_ref(&step, &env, &ref_payload)?;
+
+            println!("{cid}");
+            Ok(())
+        }
+        DeployAction::Get { step, env } => {
+            let refs = BritRefManager::new(repo_path)?;
+            match refs.get_deploy_ref(&step, &env)?
{
+                Some(payload) => {
+                    println!("{}", serde_json::to_string_pretty(&payload)?);
+                    Ok(())
+                }
+                None => {
+                    eprintln!("no deploy attestation for step={step} env={env}");
+                    Ok(())
+                }
+            }
+        }
+        DeployAction::List { step, env: _ } => {
+            let refs = BritRefManager::new(repo_path)?;
+            let steps = refs.list_deploy_refs(step.as_deref())?;
+            for s in steps {
+                println!("{s}");
+            }
+            Ok(())
+        }
+    }
+}
+```
+
+- [ ] **Step 7.6: Create `validate_cmd.rs`**
+
+Create `brit-build-ref/src/validate_cmd.rs`:
+
+```rust
+//! `brit build-ref validate` subcommands.
+
+use std::path::Path;
+
+use brit_epr::elohim::attestation::validation::{
+    ValidationAttestationContentNode, ValidationResult,
+};
+use brit_epr::engine::cid::BritCid;
+use brit_epr::engine::content_node::ContentNode;
+use brit_epr::engine::object_store::LocalObjectStore;
+use brit_epr::engine::signing::AgentKey;
+use brit_epr::elohim::refs::BritRefManager;
+use clap::Subcommand;
+
+#[derive(Subcommand)]
+pub enum ValidateAction {
+    /// Record a validation attestation.
+    Put {
+        /// Qualified step name.
+        #[arg(long)]
+        step: String,
+        /// Check name (e.g., sonarqube-scan@v10).
+        #[arg(long)]
+        check: String,
+        /// CID of the artifact validated.
+        #[arg(long)]
+        artifact: String,
+        /// Result: pass, fail, warn, skip.
+        #[arg(long)]
+        result: String,
+        /// Human-readable summary.
+        #[arg(long, default_value = "")]
+        summary: String,
+    },
+    /// Read a validation attestation.
+    Get {
+        /// Qualified step name.
+        #[arg(long)]
+        step: String,
+        /// Check name.
+        #[arg(long)]
+        check: String,
+    },
+    /// List validation attestation steps.
+    List {
+        /// Filter by step pattern.
+        #[arg(long)]
+        step: Option<String>,
+        /// Filter by check pattern.
+        #[arg(long)]
+        check: Option<String>,
+    },
+}
+
+pub fn run(repo_path: &Path, action: ValidateAction) -> Result<(), Box<dyn std::error::Error>> {
+    match action {
+        ValidateAction::Put {
+            step,
+            check,
+            artifact,
+            result,
+            summary,
+        } => {
+            let git_dir = repo_path.join(".git");
+            let store = LocalObjectStore::for_git_dir(&git_dir);
+            let key = AgentKey::load_or_generate(&git_dir.join("brit").join("agent-key"))?;
+            let refs = BritRefManager::new(repo_path)?;
+
+            let artifact_cid: BritCid = artifact.parse()?;
+            let validation_result = match result.as_str() {
+                "pass" => ValidationResult::Pass,
+                "fail" => ValidationResult::Fail,
+                "warn" => ValidationResult::Warn,
+                "skip" => ValidationResult::Skip,
+                other => return Err(format!("unknown result: {other}").into()),
+            };
+
+            let now = chrono::Utc::now().to_rfc3339();
+
+            let mut node = ValidationAttestationContentNode {
+                artifact_cid: artifact_cid.clone(),
+                check_name: check.clone(),
+                validator_id: key.agent_id(),
+                validator_version: "0.0.0".into(),
+                result: validation_result,
+                result_summary: summary,
+                findings_cid: None,
+                validated_at: now,
+                ttl_sec: None,
+                signature: String::new(),
+            };
+
+            let canonical = node.canonical_json()?;
+            node.signature = key.sign(&canonical);
+
+            let cid = store.put(&node)?;
+
+            let ref_payload = serde_json::json!({
+                "artifactCid": artifact_cid.as_str(),
+                "attestationCid": cid.as_str(),
+                "result": result,
+                "validatedAt": node.validated_at,
+            });
+            refs.put_validate_ref(&step, &check, &ref_payload)?;
+
+            println!("{cid}");
+            Ok(())
+        }
+        ValidateAction::Get { step, check } => {
+            let refs = BritRefManager::new(repo_path)?;
+            match refs.get_validate_ref(&step, &check)?
{ + Some(payload) => { + println!("{}", serde_json::to_string_pretty(&payload)?); + Ok(()) + } + None => { + eprintln!("no validation attestation for step={step} check={check}"); + Ok(()) + } + } + } + ValidateAction::List { step, check: _ } => { + let refs = BritRefManager::new(repo_path)?; + let steps = refs.list_validate_refs(step.as_deref())?; + for s in steps { + println!("{s}"); + } + Ok(()) + } + } +} +``` + +- [ ] **Step 7.7: Create `reach_cmd.rs`** + +Create `brit-build-ref/src/reach_cmd.rs`: + +```rust +//! `brit build-ref reach` subcommands. + +use std::path::Path; + +use brit_epr::elohim::attestation::reach::{compute_reach, ReachInput}; +use brit_epr::elohim::refs::BritRefManager; +use clap::Subcommand; + +#[derive(Subcommand)] +pub enum ReachAction { + /// Compute reach level from current attestations and write the ref. + Compute { + /// Qualified step name. + #[arg(long)] + step: String, + }, + /// Read the current reach level for a step. + Get { + /// Qualified step name. 
+        #[arg(long)]
+        step: String,
+    },
+}
+
+pub fn run(repo_path: &Path, action: ReachAction) -> Result<(), Box<dyn std::error::Error>> {
+    let refs = BritRefManager::new(repo_path)?;
+
+    match action {
+        ReachAction::Compute { step } => {
+            // Collect attestation data from refs
+            let build_agents = refs.list_build_refs(Some(step.as_str()))?;
+            let deploy_envs = refs.list_deploy_refs(Some(step.as_str()))?;
+            let validate_checks = refs.list_validate_refs(Some(step.as_str()))?;
+
+            let input = ReachInput {
+                build_attestations: build_agents,
+                deploy_attestations: deploy_envs,
+                validation_attestations: validate_checks,
+            };
+
+            let level = compute_reach(&input);
+
+            let payload = serde_json::json!({
+                "stepName": step,
+                "computedReach": level,
+                "buildAttestations": input.build_attestations.len(),
+                "deployAttestations": input.deploy_attestations.len(),
+                "validationAttestations": input.validation_attestations.len(),
+            });
+
+            refs.put_reach_ref(&step, &payload)?;
+
+            println!("{}", serde_json::to_string_pretty(&payload)?);
+            Ok(())
+        }
+        ReachAction::Get { step } => {
+            match refs.get_reach_ref(&step)? {
+                Some(payload) => {
+                    println!("{}", serde_json::to_string_pretty(&payload)?);
+                    Ok(())
+                }
+                None => {
+                    eprintln!("no reach level computed for step={step}");
+                    Ok(())
+                }
+            }
+        }
+    }
+}
+```
+
+- [ ] **Step 7.8: Add chrono dependency to brit-build-ref**
+
+Edit `brit-build-ref/Cargo.toml`, add to `[dependencies]`:
+
+```toml
+chrono = { version = "0.4", default-features = false, features = ["clock"] }
+```
+
+- [ ] **Step 7.9: Build the binary**
+
+Run:
+
+```
+cargo build -p brit-build-ref
+```
+
+Expected: compiles. If the build fails on `chrono::Utc::now()`, check that the chrono features include `clock`.
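The `reach compute` subcommand from Step 7.7 folds the three ref lists into a single level via `compute_reach`. A minimal sketch of the deterministic ladder, assuming struct shapes that mirror the `ReachInput` construction in `reach_cmd.rs` (the real function and types live in `brit_epr::elohim::attestation::reach`):

```rust
// Hypothetical stand-ins for the brit_epr reach types; field names
// mirror how reach_cmd.rs builds ReachInput from the three ref lists.
#[derive(Debug, PartialEq)]
pub enum ReachLevel {
    Unknown,
    Built,
    Deployed,
    Verified,
}

pub struct ReachInput {
    pub build_attestations: Vec<String>,
    pub deploy_attestations: Vec<String>,
    pub validation_attestations: Vec<String>,
}

// Deterministic ladder: each rung requires the previous one, so the
// same input always yields the same level.
pub fn compute_reach(input: &ReachInput) -> ReachLevel {
    if input.build_attestations.is_empty() {
        ReachLevel::Unknown
    } else if input.deploy_attestations.is_empty() {
        ReachLevel::Built
    } else if input.validation_attestations.is_empty() {
        ReachLevel::Deployed
    } else {
        ReachLevel::Verified
    }
}
```

This is consistent with the Step 7.10 smoke test: one build, one deploy, and one passing validation land the step at `Verified`.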
+ +- [ ] **Step 7.10: End-to-end smoke test** + +In the brit submodule workspace, run a full build→deploy→validate→reach cycle: + +```bash +# Create a temp repo for testing +SMOKE_DIR=$(mktemp -d) +git init "$SMOKE_DIR" --initial-branch=main +git -C "$SMOKE_DIR" -c user.email=test@test.com -c user.name=test commit --allow-empty -m "init" + +# Fake CIDs (64-char hex) +MANIFEST_CID=$(printf '%064d' 1) +OUTPUT_CID=$(printf '%064d' 2) +ARTIFACT_CID=$OUTPUT_CID + +# Build attestation +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" build put \ + --step elohim-edge:storage --manifest "$MANIFEST_CID" --output "$OUTPUT_CID" + +# Deploy attestation +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" deploy put \ + --step elohim-edge:storage --env staging --artifact "$ARTIFACT_CID" \ + --endpoint https://staging.elohim.host + +# Validate attestation +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" validate put \ + --step elohim-edge:storage --check sonarqube-scan@v10 \ + --artifact "$ARTIFACT_CID" --result pass --summary "0 bugs" + +# Compute reach +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" reach compute \ + --step elohim-edge:storage + +# Read back +echo "--- Build ---" +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" build get --step elohim-edge:storage +echo "--- Deploy ---" +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" deploy get --step elohim-edge:storage --env staging +echo "--- Validate ---" +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" validate get --step elohim-edge:storage --check sonarqube-scan@v10 +echo "--- Reach ---" +cargo run -p brit-build-ref -- --repo "$SMOKE_DIR" reach get --step elohim-edge:storage + +# Cleanup +rm -rf "$SMOKE_DIR" +``` + +Expected: each command prints a CID or JSON payload. Reach should show `"computedReach": "verified"`. 
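The refs the smoke test writes are ordinary git refs, so they can be inspected with stock git before the cleanup line. A self-contained sketch of the same mechanism using only stock git (the `build/demo-step` subpath is illustrative; the exact layout under `refs/notes/brit/` is whatever `BritRefManager` writes):

```bash
DEMO=$(mktemp -d)
git init -q "$DEMO"
git -C "$DEMO" -c user.email=t@t -c user.name=t commit -q --allow-empty -m init
# Write a ref under the brit notes namespace, standing in for a put.
git -C "$DEMO" update-ref refs/notes/brit/build/demo-step "$(git -C "$DEMO" rev-parse HEAD)"
# Stock git sees it as an ordinary ref.
REFS=$(git -C "$DEMO" for-each-ref --format='%(refname)' refs/notes/brit)
echo "$REFS"
rm -rf "$DEMO"
```

In the smoke test itself, the equivalent check is `git -C "$SMOKE_DIR" for-each-ref refs/notes/brit` run before the `rm -rf` cleanup.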
+ +- [ ] **Step 7.11: Commit** + +``` +git add brit-build-ref/ Cargo.toml Cargo.lock +git commit -m "feat(brit-build-ref): CLI for build/deploy/validate/reach attestation refs + +Full brit build-ref command group: put/get/list for build, deploy, and +validation attestations; compute/get for derived reach. Agent key auto- +generated on first use. ContentNodes stored in .git/brit/objects/, +indexed via refs/notes/brit/. End-to-end smoke-tested." +``` + +--- + +## Task 8: Feature flag enforcement and engine-only build check + +**Files:** +- Modify: `brit-epr/src/lib.rs` (add attestation re-exports) + +- [ ] **Step 8.1: Add attestation re-exports to lib.rs** + +Edit `brit-epr/src/lib.rs` — add to the feature-gated re-exports: + +```rust +// Feature-gated re-exports +#[cfg(feature = "elohim-protocol")] +pub use elohim::{ + parse_pillar_trailers, validate_pillar_trailers, ElohimProtocolSchema, PillarTrailers, + PillarValidationError, TrailerKey, +}; + +// Re-export attestation types for convenience +#[cfg(feature = "elohim-protocol")] +pub mod attestation { + //! Convenience re-exports for attestation types. + pub use crate::elohim::attestation::build::BuildAttestationContentNode; + pub use crate::elohim::attestation::deploy::{DeployAttestationContentNode, HealthStatus}; + pub use crate::elohim::attestation::reach::{compute_reach, ReachInput, ReachLevel}; + pub use crate::elohim::attestation::validation::{ + ValidationAttestationContentNode, ValidationResult, + }; + pub use crate::elohim::refs::BritRefManager; +} +``` + +- [ ] **Step 8.2: Verify engine-only build** + +Run: + +``` +cargo build -p brit-epr --no-default-features +``` + +Expected: compiles. The engine (CID, ContentNode trait, ObjectStore, signing) builds without attestation types. + +- [ ] **Step 8.3: Run all tests** + +Run: + +``` +cargo test -p brit-epr +``` + +Expected: all tests pass. 
Count should be approximately: +- Phase 0+1: 9 (engine_parsing: 2, elohim_parse: 3, elohim_validate: 4) +- Task 1 CID: 5 +- Task 2 ObjectStore: 4 +- Task 3 Signing: 4 +- Task 4 Attestation roundtrip: 5 +- Task 5 Ref management: 5 +- Task 6 Reach: 5 +- Total: ~37 + +- [ ] **Step 8.4: Commit** + +``` +git add brit-epr/src/lib.rs +git commit -m "feat(brit-epr): add attestation convenience re-exports + +Re-export attestation types at crate root behind elohim-protocol +feature flag. Engine-only build verified: --no-default-features still +compiles." +``` + +--- + +## Task 9: Move design doc and bump submodule pointer + +**Files:** +- Add: `docs/plans/phases/phase-2a-build-attestation-primitives.md` (the recovered design doc, now in the submodule) +- Modify: `docs/plans/README.md` (add Phase 2a to the roadmap table) +- Then in parent monorepo: bump submodule pointer + +- [ ] **Step 9.1: Verify the design doc is in place** + +``` +ls -la docs/plans/phases/phase-2a-build-attestation-primitives.md +``` + +Expected: file exists (copied in earlier step). + +- [ ] **Step 9.2: Update the roadmap README** + +Edit `docs/plans/README.md` — update the phased decomposition table. Add Phase 2a between Phase 1 and Phase 2: + +Find the line: +``` +| **2** | ContentNode adapter | +``` + +Add before it: +``` +| **2a** | Build attestation primitives | `BuildAttestationContentNode`, `DeployAttestationContentNode`, `ValidationAttestationContentNode` schemas + `brit build-ref` CLI + ref namespace under `refs/notes/brit/`. Pure local — no DHT, no P2P. | **Plan: [2026-04-16-phase-2a-build-attestation-primitives.md](./2026-04-16-phase-2a-build-attestation-primitives.md)** | +``` + +Also update Phase 0+1 status to "**Done**" if not already. 
+ +- [ ] **Step 9.3: Commit in submodule** + +``` +git add docs/plans/phases/ docs/plans/README.md docs/plans/2026-04-16-phase-2a-build-attestation-primitives.md +git commit -m "docs(plans): add Phase 2a design doc and implementation plan + +Recovered Phase 2a design doc (build attestation primitives) placed +in docs/plans/phases/. Implementation plan covers all tasks from +engine foundation through CLI and feature flag enforcement. Roadmap +README updated with Phase 2a entry." +``` + +- [ ] **Step 9.4: Switch to parent monorepo and bump** + +``` +cd /projects/elohim +git add elohim/brit +git commit -m "chore(brit): bump submodule to Phase 2a attestation primitives + +Advances the brit submodule pointer to include Phase 2a: three +attestation ContentNode schemas, LocalObjectStore, agent signing, +git ref namespace management under refs/notes/brit/, and the +brit-build-ref CLI." +``` + +- [ ] **Step 9.5: Report back** + +``` +Phase 2a plan complete. Summary: + + - Engine foundation: BritCid (blake3), ContentNode trait, LocalObjectStore, + AgentKey (ed25519 signing). + - Three attestation schemas: BuildAttestationContentNode, + DeployAttestationContentNode, ValidationAttestationContentNode. + - Git ref management under refs/notes/brit/ for build/deploy/validate/reach. + - Deterministic reach computation: Unknown → Built → Deployed → Verified. + - brit-build-ref CLI with full put/get/list for all attestation types. + - ~37 tests covering roundtrip, CID determinism, signing, ref ops, reach. + - Engine-only build verified (--no-default-features). + - Design doc and implementation plan committed to submodule. + - Submodule pointer bumped in parent monorepo. + +Ready to push both repos. Waiting for confirmation. 
+```
+
+---
+
+## Self-Review
+
+**Spec coverage:**
+- ✅ `BuildAttestationContentNode` schema with all fields from spec (Task 4)
+- ✅ `DeployAttestationContentNode` schema with all fields from spec (Task 4)
+- ✅ `ValidationAttestationContentNode` schema with check vocabulary (Task 4)
+- ✅ `brit build-ref build put/get/list` CLI (Task 7)
+- ✅ `brit build-ref deploy put/get/list` CLI (Task 7)
+- ✅ `brit build-ref validate put/get/list` CLI (Task 7)
+- ✅ `brit build-ref reach compute/get` CLI (Task 7)
+- ✅ Ref namespace under `refs/notes/brit/` (Task 5)
+- ✅ Serde roundtrip for all three types (Task 4)
+- ✅ CID determinism (Task 1, Task 4)
+- ✅ Agent signing on every `put` (Task 3, Task 7)
+- ✅ `elohim-protocol` feature flag enforcement (Task 8)
+- ✅ Engine compiles with `--no-default-features` (Task 8)
+- ✅ Reach computation is deterministic (Task 6)
+
+**Spec acceptance criteria check:**
+- ✅ "All three schemas round-trip through serialize/deserialize" — Task 4 tests
+- ⚠️ "with ts-rs generation" — NOT covered. ts-rs is a codegen concern; Phase 2a can add it as a follow-up task after the types stabilize. The types are ts-rs-compatible (all serde-serializable with camelCase).
+- ✅ "`brit build-ref build put/get/list` work on a fresh git repo" — Task 7 smoke test
+- ✅ "Refs written by `brit build-ref` are visible via `git notes`" — Task 5 uses git notes for build refs
+- ⚠️ "Refs survive `git clone --bare` + `git fetch refs/notes/*`" — not explicitly tested, but the ref layout is designed for this. Add a follow-up integration test.
+- ✅ "Engine compiles with `--no-default-features`" — Task 8 Step 8.2
+- ✅ "Reach computation is deterministic" — Task 6 test
+- ⚠️ "Check vocabulary registration is enforced: unregistered checkName values rejected" — NOT covered in Phase 2a CLI. The spec says check names are governed by AppManifest, which doesn't exist yet. The validation type stores whatever check name it is given. Add enforcement when AppManifest lands.
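The deferred bare-clone criterion can be exercised with stock git alone. A sketch of the follow-up integration test, using an illustrative ref subpath in place of what `brit build-ref` actually writes:

```bash
SRC=$(mktemp -d); DST=$(mktemp -d)
git init -q "$SRC"
git -C "$SRC" -c user.email=t@t -c user.name=t commit -q --allow-empty -m init
# Stand-in for a brit attestation ref.
git -C "$SRC" update-ref refs/notes/brit/build/demo "$(git -C "$SRC" rev-parse HEAD)"
# Bare-clone, then fetch the notes namespace explicitly per the criterion.
git clone -q --bare "$SRC" "$DST/clone.git"
git -C "$DST/clone.git" fetch -q "$SRC" 'refs/notes/*:refs/notes/*'
SURVIVED=$(git -C "$DST/clone.git" for-each-ref --format='%(refname)' refs/notes/brit)
echo "$SURVIVED"
rm -rf "$SRC" "$DST"
```

The real test would run `brit build-ref build put` against `$SRC` instead of `update-ref`, then assert the same refs exist in the clone.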
+ +**Placeholder scan:** None. Every step has actual code, actual commands, actual expected output. + +**Type consistency:** `BritCid` used identically across all modules. `ContentNode` trait implemented by all three attestation types. `AgentKey` used consistently in all CLI put commands. `BritRefManager` used consistently across CLI and tests. `ReachLevel` enum used in both computation and CLI output. + +**Deferred to later phases:** +- ts-rs TypeScript generation → after types stabilize +- Check vocabulary enforcement → when AppManifest exists +- `git clone --bare` + fetch integration test → follow-up +- Full multiformats CIDv1 → when IPFS/Holochain interop needed +- DHT publication of attestations → Phase 5 +- Economic event emission → shefa integration phase diff --git a/docs/plans/README.md b/docs/plans/README.md new file mode 100644 index 00000000000..1eb334db970 --- /dev/null +++ b/docs/plans/README.md @@ -0,0 +1,88 @@ +# Brit — EPR-Applied Git Roadmap + +**Brit** (בְּרִית, "covenant") is an expansion of [gitoxide](https://github.com/GitoxideLabs/gitoxide) that makes version control covenantal. Every commit carries three-pillar metadata (lamad/shefa/qahal). Merges are covenantal joinings of lineages. Forks are new covenants, legitimately grown from old ones. Branches carry reach levels that govern who sees what. + +## Why Fork gitoxide + +gitoxide is a pure-Rust git implementation with a clean modular design — each concern lives in its own `gix-*` crate and swaps independently. That modularity lets us layer protocol semantics onto git without rewriting the object model. The engine/app-schema split (§2 of the schema manifest) keeps brit-epr usable as a generic substrate while the Elohim Protocol vocabulary stays behind a feature flag. 
+ +## Architecture + +Brit is the **semantic layer** (commits, refs, pillar coupling, governance) over two substrates: +- **gitoxide** — the git object model, refs, pack protocol +- **rust-ipfs** — CIDs, multihash, Bitswap, libp2p (Phase 2+) + +The hybrid design: commit **trailers** are the protocol surface (survive `git clone` from any forge); linked **ContentNodes** are the graph surface (rich pillar metadata, resolved through doorway). + +See [composition.md](../composition.md) for how brit composes with rakia, protocol schemas, and storage. + +## Phase Decomposition + +Seven phases, decomposed by **protocol capability** — what the protocol gains, not what crate gets written. Each phase produces working, testable software. + +| Phase | Capability unlocked | What the protocol gains | Status | +|---|---|---|---| +| **0+1** | **Commits carry covenant** | Every commit can be checked for three-pillar compliance. Trailers parse, validate, round-trip through stock git. `brit verify` works. | **Complete** ([plan](./2026-04-11-phase-0-epr-trailer-foundation.md)) | +| **2a** | **Artifacts become self-aware** | Build, deploy, and validation attestations as signed ContentNodes. Refs under `refs/notes/brit/`. Reach computation. `brit build-ref` CLI. Pure local. | **Complete** ([plan](./2026-04-16-phase-2a-build-attestation-primitives.md)) | +| **2** | **Git artifacts become protocol content** | Repos, commits, branches, trees addressable by CID. Schema-driven types. Rakia can emit BuildManifestContentNode. | [Summary](phases/phase-2-contentnode-adapter.md) | +| **3** | **Git over P2P** | Clone, fetch, push over libp2p. No forge required. Rakia-peer shares transport. | [Summary](phases/phase-3-libp2p-transport.md) | +| **4** | **Branches tell their story** | Per-branch READMEs as EPRs. Reach-governed visibility. Build status per branch. | [Summary](phases/phase-4-branch-readmes.md) | +| **5** | **Repos discoverable on the network** | DHT announcement. Peers find repos by CID. 
Build manifests discoverable. | [Summary](phases/phase-5-dht-discovery.md) | +| **6** | **Forks are governance acts** | Fork as ContentNode with stewardship. Cross-fork merges via qahal consent. | [Summary](phases/phase-6-fork-governance.md) | + +### Parallel evolution with rakia + +Brit and [rakia](https://github.com/ethosengine/rakia) evolve on parallel tracks. Each brit phase unlocks rakia capability, but neither blocks the other: + +| Brit phase | What rakia gains | +|---|---| +| Phase 0+1 (current) | Attestation trailers. Change detection via gix. Baseline refs. | +| Phase 2 | BuildManifestContentNode + BuildAttestationContentNode as protocol content | +| Phase 3 | Shared P2P transport for manifest distribution | +| Phase 4 | Build status per branch via reach-governed branch metadata | +| Phase 5 | Build manifest discovery via DHT | +| Phase 6 | Forked build recipes with independent stewardship | + +## Design Principles + +1. **Round-trip with stock git.** Every commit brit produces must be readable by stock git. Trailers are RFC-822 lines, not magic bytes. `git clone` from any forge works. + +2. **Schema-driven development.** JSON Schema files define all ContentNode types, trailer grammar, and enum vocabularies. Rust types are generated from schemas. Validation harness catches drift. + +3. **Engine doesn't know the protocol.** brit-epr parses trailers and dispatches to `AppSchema`. The Elohim Protocol vocabulary is behind `#[cfg(feature = "elohim-protocol")]`. Someone can write `AcmeSchema` for carbon accounting without touching brit. + +4. **Brit doesn't own governance.** Brit reads consent requirements from the parent EPR's qahal context. The governance gateway handles tally logic. Brit publishes proposals and executes results. + +5. **Upstream-rebaseable.** New functionality goes in new crates. Modifications to `gix-*` crates are limited to bugs and additive extension points. The fork doesn't become a boat anchor. + +6. 
**Additive, not destructive.** Phase 0 doesn't rename `gix-*` to `brit-*`. We earn renames only when semantics diverge enough to justify the churn. + +7. **LLM-first CLI.** Command names mirror git (use training as cognitive carrier). Hard parts of pillar authoring pushed into skill + template, not into the prompt. Humans use the UI for review and consent. + +## Loops This Closes + +| Loop | How brit closes it | +|---|---| +| **Build baselines** | Pipeline baselines become git refs (`refs/notes/rakia/baselines`), not Jenkins artifacts | +| **Build artifacts** | CI outputs become CID-addressed, attested ContentNodes via rakia | +| **Schema versioning** | Every schema version is a commit, every evolution is a branch, N versions coexist | +| **Journal publishing** | Journal entries become commits to stewarded refs | +| **Attestation** | Agent-signed reviews are commit trailers with capability claims | +| **Fork governance** | Forks are ContentNodes with their own stewardship and governance | +| **Branch views** | Branches are ContentNodes with reach, audience, and per-branch READMEs | + +## Key Documents + +| Document | Purpose | +|---|---| +| [Schema manifest](../schemas/elohim-protocol-manifest.md) | 1700+ line exploration of the full app schema: ContentNode catalog, trailer spec, CLI surface, signals, doorway registration | +| [Design spec](../specs/2026-04-12-brit-design.md) | Formal spec with architecture, design decisions, and open questions | +| [Composition model](../composition.md) | How brit composes with rakia, protocol schemas, rust-ipfs, storage | +| [Merge consent critique](../schemas/reviews/2026-04-11-merge-consent-critique.md) | Design review of async-default merge consent | +| [Phase 0+1 plan](./2026-04-11-phase-0-epr-trailer-foundation.md) | Implementation plan (complete) | + +## How to Read the Plans + +Phase plans in `phases/` are summary documents with vision, prerequisites, sprint sketches, and risks. 
Each gets a full implementation plan (in this directory) before execution. + +Implementation plans follow the [superpowers writing-plans skill](https://github.com/obra/superpowers) format: bite-sized tasks, TDD-first, implementation-ready for someone with Rust experience and zero prior context. diff --git a/docs/plans/phases/phase-2-contentnode-adapter.md b/docs/plans/phases/phase-2-contentnode-adapter.md new file mode 100644 index 00000000000..5bc8f053e5b --- /dev/null +++ b/docs/plans/phases/phase-2-contentnode-adapter.md @@ -0,0 +1,54 @@ +# Phase 2: Git Artifacts Become ContentNodes + +**Status:** Needs design session +**Depends on:** Phase 0+1 complete (trailer parsing + validation) +**Capability unlocked:** Repos, commits, branches, trees, and blobs are addressable protocol content. Rakia can emit BuildManifestContentNode and BuildAttestationContentNode. + +## Vision + +A git repository, viewed through brit, is not just a DAG of commits — it's a constellation of ContentNodes in the protocol's content graph. Each commit, branch, tree, and blob has a CID. Each carries three-pillar metadata. The doorway can serve any of them to any protocol participant. + +This is the phase where brit stops being "git with trailer discipline" and becomes "the version control surface of a distributed knowledge network." + +## What Changes + +- `RepoContentNode`, `CommitContentNode`, `BranchContentNode`, `TreeContentNode`, `BlobContentNode` types implemented in `brit-epr` +- Export adapter: git repo -> ContentNodes (serialize to DAG-CBOR, produce CIDs). Source of truth: git object store. ContentNodes are projections of git objects, not replacements. +- Import adapter: ContentNodes -> git objects (for repos cloned via P2P, not just git). Source of truth for imported repos: the DHT-notarized ContentNodes until a local git object store is populated. 
+- The elohim monorepo itself is imported as a test — the first brit-native repo +- JSON Schema files for all ContentNode types written and validated +- Rust types generated from schemas (codegen pipeline established) +- `schema_contract.rs` validation harness (mirrors elohim-storage pattern) + +## What This Unlocks for Rakia + +- `BuildManifestContentNode` can be implemented — rakia's Sprint 6 ("The Manifest Becomes a ContentNode") depends on this adapter +- `BuildAttestationContentNode` can be stored and retrieved as protocol content +- Manifest CIDs are computable — rakia can content-address its build manifests + +## What This Unlocks for the Protocol + +- Repos are addressable by CID — "give me this repo" works regardless of which peer hosts it. Source of truth: the git object store, projected as ContentNodes into the DHT. +- Commits are linked to their ContentNode representations — the doorway can serve rich commit views +- Branches resolve to ContentNodes with reach, governance, and purpose metadata. Source of truth for reach/governance: the DHT-notarized BranchContentNode. Source of truth for branch content: the git ref. + +## Prerequisites + +- Phase 0+1 complete (trailer parsing, PillarTrailers struct, brit-verify) +- rust-ipfs available for DAG-CBOR serialization and CID computation (may require adding as submodule) +- Protocol schemas for all brit ContentNode types finalized (source of truth: `elohim/sdk/schemas/v1/`, vendored into brit's `schemas/elohim-protocol/v1/`) + +## Sprint Sketch (to be decomposed in design session) + +1. **Schema first** — JSON Schema files for all 12+ ContentNode types, codegen pipeline, validation harness +2. **Core adapter** — RepoContentNode + CommitContentNode: serialize git repo/commit to DAG-CBOR, produce CIDs +3. **Tree/blob adapter** — TreeContentNode + BlobContentNode: file-level content addressing +4. **Branch/ref adapter** — BranchContentNode with reach, protection rules, per-branch README slot +5. 
**Reserved types** — BuildManifestContentNode + BuildAttestationContentNode: implement the reserved slots from §5.12 +6. **Import test** — import the elohim monorepo as the first brit-native repo; validate round-trip + +## Risks + +- **DAG-CBOR canonical serialization** must produce stable CIDs. Two implementations computing a CID for the same commit must get the same result. This needs a canonicalization spec before implementation. +- **rust-ipfs integration depth** — how much of rust-ipfs does brit need? Just DAG-CBOR + CID computation, or also blockstore? Decision affects submodule timing. +- **Schema evolution** — once ContentNode types are published and have CIDs in the wild, changing the schema changes the CIDs. Schema versioning follows the P2P DAG model (source of truth: the CID-addressed schema version itself; N versions coexist; migrations compose along paths). Versioning strategy needed before publishing. diff --git a/docs/plans/phases/phase-2a-build-attestation-primitives.md b/docs/plans/phases/phase-2a-build-attestation-primitives.md new file mode 100644 index 00000000000..de236c273a0 --- /dev/null +++ b/docs/plans/phases/phase-2a-build-attestation-primitives.md @@ -0,0 +1,194 @@ +# Phase 2a — Build Attestation Primitives + +**Status:** Design +**Date:** 2026-04-13 +**Depends on:** Phase 0 (EPR trailer foundation), Phase 2 (ContentNode adapter) +**Independent of:** Phases 3–6 (libp2p, branch READMEs, DHT discovery, fork governance) — this phase is pure local, requires no networking +**Consumers:** rakia (see `elohim/rakia/docs/plans/build-attestation-integration.md`) + +## Problem + +Build and deployment state today live in executor memory (Jenkins JSON artifacts, `build-state.json`). 
When the executor fails mid-run, the state either:
+
+- **Leapfrogs** — advances past unbuilt changes, poisoning future change detection
+- **Gets lost** — falls back to a stale global baseline, triggering full rebuilds
+
+And even when nothing fails, the state **says nothing about deployment** — CI knows what it triggered, not what's actually running.
+
+The artifact itself has no way to answer "was I built?" / "am I deployed?" / "how trusted am I?" without consulting external systems that can disagree or disappear.
+
+## Insight
+
+These are protocol primitives, not CI infrastructure. An artifact's build and deployment state belongs in the same covenant structure that brit already applies to commits: content-addressed, peer-attested, carrying pillar coupling, survivable across executor death.
+
+**The artifact becomes self-aware** through peer attestations in the git namespace plus the DHT. Any participant can compute what needs to build, deploy, or promote without asking Jenkins, without reading a fragile JSON file, without trusting a central registry.
+
+Succession and stewardship are innate: if the primary steward is unavailable, the collective's succession order kicks in. Stewardship credit flows through the REA shefa pillar for every attestation produced. The bus factor dissolves.
+
+## Scope of This Phase
+
+Brit adds three ContentNode types and ref-management CLI commands. Pure local operations (schema + refs + git). No DHT, no P2P, no remote peers. The DHT publication path opens in a later phase.
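The recoverability claim above — any participant recomputes build state from attestations rather than trusting executor memory — can be sketched in a few lines of Rust. Everything here (`BuildAttestation` as a plain struct, the `needs_build` predicate, the `sha256:` hash strings) is a hypothetical stand-in for the real ContentNode types and is not brit's actual API:

```rust
/// Hypothetical stand-in for a BuildAttestationContentNode; field names
/// mirror the schema in this document, but this is a sketch, not brit's types.
#[derive(Debug)]
struct BuildAttestation {
    step_name: String,
    inputs_hash: String,
    success: bool,
}

/// A step needs a (re)build iff no successful attestation matches the
/// content hash of its current inputs. No executor state is consulted:
/// any peer holding the attestations computes the same answer.
fn needs_build(history: &[BuildAttestation], step: &str, inputs_hash: &str) -> bool {
    !history
        .iter()
        .any(|a| a.success && a.step_name == step && a.inputs_hash == inputs_hash)
}

fn main() {
    let history = vec![BuildAttestation {
        step_name: "elohim-edge:cargo-build-storage".to_string(),
        inputs_hash: "sha256:aaaa".to_string(),
        success: true,
    }];

    // Inputs unchanged since the last successful build: nothing to do.
    assert!(!needs_build(&history, "elohim-edge:cargo-build-storage", "sha256:aaaa"));
    // Inputs changed: rebuild, no matter what any executor remembers.
    assert!(needs_build(&history, "elohim-edge:cargo-build-storage", "sha256:bbbb"));
    // A leapfrogged step (no attestation at all) is also simply "needs build".
    assert!(needs_build(&history, "elohim-edge:cargo-build-web", "sha256:cccc"));
    println!("ok");
}
```

The deployment analogue would add a staleness check against `livenessTtlSec` before counting an attestation, so that claims self-invalidate without any peer retracting them.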
+ +**In scope:** +- `BuildAttestationContentNode` schema +- `DeployAttestationContentNode` schema +- `ValidationAttestationContentNode` schema (with check vocabulary) +- `brit build-ref` CLI (read/write/list for all three attestation types) +- Ref namespace design under `refs/notes/brit/` +- AppSchema registration (these are elohim-protocol-flagged types) + +**Out of scope (future phases):** +- DHT publication of attestations (future phase, composes with Phase 5 DHT discovery) +- Cross-peer attestation reconciliation (future phase, composes with Phase 3 libp2p transport) +- Reach promotion rule DSL (future phase, tied to AppManifest work) +- Economic event emission on attestation (shefa integration, separate phase) + +## Schemas + +### `BuildAttestationContentNode` + +Records that an agent produced an output artifact from a manifest's inputs. + +| Field | Type | Description | +|---|---|---| +| `manifestCid` | CID | The BuildManifestContentNode this attestation is for | +| `stepName` | string | Qualified step name (e.g., `elohim-edge:cargo-build-storage`) | +| `inputsHash` | string | Content hash of all declared inputs at build time | +| `outputCid` | CID | Content-addressed output artifact | +| `agentId` | AgentPubKey | Peer that performed the build | +| `hardwareProfile` | object | CPU arch, OS, memory, relevant toolchain versions | +| `buildDurationMs` | number | Wall-clock build time | +| `builtAt` | timestamp | When the build completed | +| `success` | boolean | Did the build succeed | +| `signature` | bytes | Agent's signature over the full payload | + +**Pillar coupling:** +- Lamad: `build-knowledge` — what was built, from what, how +- Shefa: `compute-expended` — the economic cost of producing it +- Qahal: `build-authority` — agent's right to attest this artifact + +### `DeployAttestationContentNode` + +Records that an agent confirms an artifact is live at an environment. 
+
+| Field | Type | Description |
+|---|---|---|
+| `artifactCid` | CID | The output CID being attested |
+| `stepName` | string | Which step's artifact this is |
+| `environmentLabel` | string | `alpha`, `staging`, `prod`, `self`, or custom |
+| `endpoint` | string | URL or service address being verified |
+| `healthCheckUrl` | string | Endpoint used to verify liveness |
+| `healthStatus` | enum | `healthy`, `degraded`, `unreachable` |
+| `deployedAt` | timestamp | When the artifact started serving here |
+| `attestedAt` | timestamp | When this attestation was produced |
+| `livenessTtlSec` | number | After this many seconds without re-attestation, the claim self-invalidates |
+| `agentId` | AgentPubKey | Peer producing the attestation |
+| `signature` | bytes | Agent's signature over the full payload |
+
+**Pillar coupling:**
+- Lamad: `deployment-knowledge` — what is running where
+- Shefa: `serving-compute` — the cost of hosting/serving
+- Qahal: `environment-authority` — agent's right to attest this environment
+
+### `ValidationAttestationContentNode`
+
+Records that a validator (tool or agent) applied a named check to an artifact.
+
+| Field | Type | Description |
+|---|---|---|
+| `artifactCid` | CID | What was validated |
+| `checkName` | string | Registered check identifier (e.g., `sonarqube-scan@v10`, `trivy-cve@latest`, `nist-800-53`, `test-suite-vitest`, `code-review`) |
+| `validatorId` | string | Tool identity or agent pubkey |
+| `validatorVersion` | string | Version of the tool/agent |
+| `result` | enum | `pass`, `fail`, `warn`, `skip` |
+| `resultSummary` | string | Human-readable summary |
+| `findingsCid` | CID \| null | Optional detailed report |
+| `validatedAt` | timestamp | When the check was performed |
+| `ttlSec` | number \| null | When validation goes stale (e.g., CVE DB refresh interval) |
+| `signature` | bytes | Validator's signature over the full payload |
+
+**Check vocabulary is governed by the AppManifest.** A check is only recognized if its `checkName` is registered in the current manifest version.
This lets the community evolve the vocabulary — add new scanners, retire outdated ones — without protocol changes. + +**Pillar coupling:** +- Lamad: `validation-knowledge` — the findings +- Shefa: `verification-compute` — cost of running the check +- Qahal: `validation-authority` — community's recognition that this check counts + +## Ref Namespace + +All attestation refs live under `refs/notes/brit/` to stay within git's notes convention and survive clone/fetch: + +| Ref | Contents | +|---|---| +| `refs/notes/brit/build/{stepName}` | JSON: `{commit: {attestationCid, outputCid, agentId, builtAt}}` — most recent build attestation per commit | +| `refs/notes/brit/deploy/{stepName}/{env}` | JSON: `{artifactCid, attestationCid, healthStatus, attestedAt, livenessTtlSec}` | +| `refs/notes/brit/validate/{stepName}/{checkName}` | JSON: `{artifactCid, attestationCid, result, validatedAt, ttlSec}` | +| `refs/notes/brit/reach/{stepName}` | JSON: `{artifactCid, computedReach, contributingAttestations: [...]}` — derived, rebuildable from above | + +**The refs are a cache; the ContentNodes are truth.** When DHT publication lands (composes with Phase 5), attestations publish across the network; refs become projections. For this phase, the ContentNode is stored locally in `.git/brit/objects/` and the ref points to its CID — the same structure brit already uses for commit-level EPR trailers. + +## CLI + +`brit build-ref` command group: + +``` +brit build-ref build put --step --manifest --output [--success] [--hardware ] +brit build-ref build get --step [--commit ] +brit build-ref build list [--step ] + +brit build-ref deploy put --step --env