Build Dashboards and Quality Gates
Use this guide in repositories that consume docs-as-code as a Bazel dependency.
Goals:

- Publish traceability dashboards from repository needs.
- Export machine-readable metrics.
- Enforce CI thresholds with traceability_gate.
What You Get

With the docs(...) macro and the score_metamodel extension enabled, your repository can:

- build an HTML dashboard from its own Sphinx needs,
- include external needs from other repositories when desired,
- export needs.json and metrics.json for machine-readable reporting,
- gate CI on traceability thresholds via traceability_gate.
Typical Setup
For details, see Setup.
Minimal Configuration Example
In docs/conf.py:
score_metamodel_requirement_types = "feat_req,comp_req,aou_req"
score_metamodel_include_external_needs = False
Use score_metamodel_include_external_needs = True only in repositories that
intentionally aggregate requirements across module dependencies, such as
integration repositories. Use False for module repositories to gate only on
local traceability.
Building the Dashboard

After running any docs command (e.g. bazel build //:needs_json or bazel run //:docs_check, which are the fastest), the build writes metrics.json via score_metamodel, and the needs_json artifact contains:

bazel-bin/needs_json/_build/needs/needs.json
bazel-bin/needs_json/_build/needs/metrics.json
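Both files are plain JSON, so they are easy to inspect in ad-hoc scripts. A minimal sketch for loading the exported metrics (the available keys depend on the score_metamodel version in use, so none are assumed here):

```python
import json
from pathlib import Path

def load_metrics(path):
    """Load an exported metrics.json file into a plain dict."""
    return json.loads(Path(path).read_text())

# Example, after a docs build:
#   metrics = load_metrics("bazel-bin/needs_json/_build/needs/metrics.json")
# Inspect metrics.keys() to see which values your score_metamodel
# version exports before wiring them into reports.
```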
The dashboard charts and the CI gate both use the same computed metrics.
Inputs for Linkage Metrics
To get meaningful dashboard and gate values, consumer repositories typically need three inputs:

- Requirement and architecture needs in the documentation itself.
- Source code references via Reference Docs in Source Code.
- Test metadata via Reference Docs in Tests.

If any of these inputs is missing, the related chart or gate metric will stay empty or low.
Choosing Local vs Aggregated Views

There are two common modes:

Module repository

- Set score_metamodel_include_external_needs = False.
- Gate only on the needs owned by the repository itself.
- Use this for per-module implementation progress and traceability.

Integration repository

- Set score_metamodel_include_external_needs = True.
- Aggregate requirements across module dependencies when that is the intended repository purpose.
- Use this for system or integration-level dashboards.
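In conf.py the two modes differ only in this one flag; a minimal sketch using the option from the configuration example above:

```python
# docs/conf.py

# Module repository: count only needs owned by this repository.
score_metamodel_include_external_needs = False

# Integration repository: also aggregate needs from module dependencies.
# score_metamodel_include_external_needs = True
```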
CI Quality Gate
Any docs build (bazel run //:docs, bazel run //:docs_check, etc.)
writes metrics.json alongside the build output. Run the gate on the
exported metrics:
bazel run //:docs && \
bazel run //:traceability_gate -- \
--metrics-json bazel-bin/needs_json/_build/needs/metrics.json \
--min-req-code 70 \
--min-req-test 70 \
--min-req-fully-linked 60 \
--min-tests-linked 70
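Conceptually, the gate compares each exported percentage against its threshold and fails the build when any falls short. A rough Python sketch of that logic, not the tool's actual implementation (the metric key names such as req_code and req_test are assumptions mirroring the --min-* flags, not a documented schema):

```python
import json

# Hypothetical minimums mirroring the --min-* flags shown above.
THRESHOLDS = {
    "req_code": 70,
    "req_test": 70,
    "req_fully_linked": 60,
    "tests_linked": 70,
}

def check_gate(metrics, thresholds=THRESHOLDS):
    """Return (key, actual, minimum) for every metric below its threshold."""
    return [
        (key, metrics.get(key, 0), minimum)
        for key, minimum in thresholds.items()
        if metrics.get(key, 0) < minimum
    ]

# Usage sketch:
#   metrics = json.load(open("bazel-bin/needs_json/_build/needs/metrics.json"))
#   failures = check_gate(metrics)  # an empty list means the gate passes
```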
In CI, wire targets through Bazel dependencies so test execution and docs generation happen before the gate target.
In larger repositories, define a dedicated wrapper target for your standard gate thresholds so CI calls a single Bazel target.
Useful flags:
--require-all-links for strict 100 percent gating
Recommended Rollout

For a new consumer repository:

1. Start with local-only metrics.
2. Enable scan_code and verify source_code_link coverage first.
3. Add test metadata and verify testlink coverage.
4. Introduce modest thresholds in CI.
5. Raise thresholds over time as the repository matures.