Experimental

Service-reports examples — provenance

Reference examples for the service-reports cell. Each example demonstrates the canonical structure of its genre against the cell’s composition rules. Like the slides cell’s examples/, each entry has a documented origin so reviewers can judge whether the demonstrated patterns map to a real engagement shape or are fictional illustrations only.

Provenance discipline

Why this matters. Reference examples teach the cell more effectively when they sit in the context of real engagements, not invented ones. But the public-package PII discipline forbids shipping customer reconnaissance data. The bridge is engagement-shape preservation with full identity substitution: keep the industry, scope, threshold anchor, methodology rigor, and finding shape; replace every identifying string with a fictional placeholder.

Anonymization checklist (apply to every Lightfield-derived or real-customer-derived example)

  1. Customer name → fictional placeholder (industry-coherent but searchably unaffiliated with any real company in the same space). Service reports default to telecom-shaped placeholders (“Apex Telecom”, “Acme Telco”) for telecom diagnostics; pick one placeholder and use it consistently across the cell’s examples.
  2. Site nicknames / facility codes → generic descriptors (“Site A,” “the Oklahoma City facility”); never the real nickname or code.
  3. Equipment IDs / serials → omit or replace with generic descriptors (“the redundant PDU branch,” “the affected DC plant”). Manufacturer model numbers (Tyco RR0153, Emerson, etc.) MAY stay if they serve technical context and don’t identify the customer through equipment-list cross-reference.
  4. Vendor / partner names → generic (“the third-party installer,” “the upstream feeder vendor”). Same caveat as equipment: vendor names that serve technical context (the rectifier manufacturer; the standards-body) MAY stay; vendor names that identify the customer’s procurement pattern must be replaced.
  5. Specific dollar amounts → bracket with caveats (“[anchor metric pending]”) or generalize (“low six figures”) unless the figure has been published in a sanctioned case study.
  6. Real Verdigris team member names → keep (Mark Chung, Thomas Chung, Mike Mahedy, Jon Chu are public per the existing voice-foundation policy).
  7. Internal channel names / Slack workspace identifiers → never include.
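The checklist above can be sketched as a substitution pass. This is an illustrative sketch only, not the cell’s actual tooling; `IDENTITY_SUBSTITUTIONS`, `anonymize`, and every string in the map are hypothetical stand-ins, and real anonymization remains editorial judgment rather than find-and-replace:

```javascript
// Illustrative only: the checklist expressed as a literal-string
// substitution map. The invariant it demonstrates is the checklist's:
// every identifying string maps to a fictional or generic stand-in.
// All entries below are placeholder examples, not real customer strings.
const IDENTITY_SUBSTITUTIONS = new Map([
  // 1. Customer name -> industry-coherent fictional placeholder
  ["<real customer name>", "Apex Telecom"],
  // 2. Site nicknames / facility codes -> generic descriptors
  ["<real site nickname>", "Site A"],
  // 3. Equipment IDs / serials -> generic descriptors
  ["<real equipment serial>", "the redundant PDU branch"],
  // 4. Vendor names that reveal procurement -> generic role descriptions
  ["<real installer name>", "the third-party installer"],
  // 5. Unpublished dollar amounts -> caveated bracket
  ["<unpublished dollar figure>", "[anchor metric pending]"],
]);

// Replace every occurrence of each identifying string with its stand-in.
function anonymize(text) {
  let out = text;
  for (const [real, placeholder] of IDENTITY_SUBSTITUTIONS) {
    out = out.split(real).join(placeholder);
  }
  return out;
}
```

Note the checklist’s asymmetry survives in the map: customer-identifying strings are always replaced, while manufacturer model numbers and standards references (items 3 and 4) stay outside the map entirely because they serve technical context.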

Examples

Example file: apex-telecom-rectifier-portfolio-diagnostic.html
  Genre: portfolio-diagnostic
  Origin: Telecom-derived; identity fully substituted per LEARNINGS.md PII discipline. Customer name “Apex Telecom” is a fictional placeholder; site names generalized to Site 1-9; DC plant identifiers genericized to Plant A/B with subscript identifiers. IEEE 519 thresholds, THD-57 values, Gate A/B logic, Category B/A multi-tier finding pattern, peer-comparison mechanics, and methodology rigor preserved verbatim from source. Manufacturer references (Tyco, Emerson) preserved as public technical context.
  Status: SHIPPED
  GTM review: Not required (fully anonymized; no real customer identifying strings remain)

Engagement-shape preservation: what stays

The technical content IS the value. When anonymizing, preserve:

  - The industry and engagement scope (e.g., a telecom portfolio diagnostic)
  - Threshold anchors and standards references (IEEE 519, THD values)
  - Gate logic and the multi-tier finding pattern (Gate A/B, Category B/A)
  - Peer-comparison mechanics and methodology rigor, verbatim from source
  - Manufacturer references that serve technical context (Tyco, Emerson)

What gets stripped

  - Customer names, site nicknames, and facility codes
  - Equipment IDs and serials
  - Vendor and partner names that reveal the customer’s procurement pattern
  - Unpublished dollar amounts
  - Internal channel names and Slack workspace identifiers

Verification

After anonymization, the example must pass the compliance audit:

npm run audit:compliance -- categories/service-reports/examples/{file}.html

If the audit flags strings from the original (real-customer) source as findings, allowlist them with a comment explaining the workaround (matching the existing cohesion-audit-ignore pattern).

See also