Epic 15 — Audit Log & Compliance

Covers the immutable audit event feed, CSV/JSON export (server-side streaming preferred; client-side Web Worker fallback), the non-persistence guarantee, mandatory Schema-Strict Redaction Filter integration, and Auditor copy-event interception. This epic is the single canonical surface where the Schema-Strict Redaction Filter (surface: epic-15-*) is applied.

Personas: OP (full access + export); AU (read-only feed + redacted export)

Shared modules: RedactionFilter (Schema-Strict mode, surface: epic-15-*), audit-export-worker.ts, CorrelationChip, LastSyncedBadge, EnvProvenance

Story 15.1 — View Immutable Audit Feed

As an OP, I want to browse a chronologically ordered feed of all auditable system events, so that I can investigate security incidents, trace user actions, and verify policy compliance.
Scenario: Audit feed loads and paginates
Given the user navigates to /admin/audit
When GET /audit/events resolves
Then a paginated feed is rendered with one row per event showing: event timestamp (created_at in ISO local time), event type (e.g. run_triggered, approval_approved, github_app_key_rotated, workflow_force_terminated), actor identity (user display name + role at time of event), and a one-line action summary; sorted by created_at descending
Scenario: Audit feed pre-redacted via Schema-Strict filter before DOM serialisation
Given an audit event payload contains a value at a known-sensitive path (e.g. $.actor.token)
When the event row renders
Then the RedactionFilter (Schema-Strict mode, surface: epic-15-feed) processes the payload before any serialisation to the DOM; sensitive values at known paths are replaced with [REDACTED]; a redaction.applied breadcrumb is emitted per masked field
Scenario: Filter audit feed by event type and date range
When the user applies an event-type filter (project_archived) and a date range
Then GET /audit/events?type=project_archived&from=<date>&to=<date> is issued; only matching events are shown
Endpoint / DB | Purpose
GET /audit/events | Paginated audit feed; query params: type, from, to, actor_id, project_id
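The Schema-Strict pass described above can be sketched as a walk over the event payload that masks values at an allow-listed set of JSONPath-style paths. This is a minimal illustration, not the production RedactionFilter API: the path table, the Breadcrumb shape, and the function names are assumptions.

```typescript
// Hypothetical breadcrumb shape for the redaction.applied event the spec mentions.
type Breadcrumb = { type: "redaction.applied"; path: string; surface: string };

// Illustrative known-sensitive paths; the real table is schema-driven.
const SENSITIVE_PATHS = new Set(["$.actor.token", "$.payload.private_key_b64"]);

function redactEvent(
  event: Record<string, unknown>,
  surface: string,
): { redacted: Record<string, unknown>; breadcrumbs: Breadcrumb[] } {
  const breadcrumbs: Breadcrumb[] = [];
  const walk = (node: unknown, path: string): unknown => {
    if (node === null || typeof node !== "object") return node;
    if (Array.isArray(node)) return node.map((v, i) => walk(v, `${path}[${i}]`));
    const out: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(node as Record<string, unknown>)) {
      const childPath = `${path}.${key}`;
      if (SENSITIVE_PATHS.has(childPath)) {
        // Replace the value before it can reach DOM serialisation.
        out[key] = "[REDACTED]";
        breadcrumbs.push({ type: "redaction.applied", path: childPath, surface });
      } else {
        out[key] = walk(value, childPath);
      }
    }
    return out;
  };
  return { redacted: walk(event, "$") as Record<string, unknown>, breadcrumbs };
}
```

The key property for the scenario above is that redaction happens on the data structure itself, before any rendering layer sees it, and that one breadcrumb is emitted per masked field.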

Story 15.2 — Server-Side Streaming Export (Preferred Path)

As an OP, I want to export the audit log as a pre-redacted CSV or JSON file streamed directly from the server, so that the browser never sees raw sensitive values and the export is memory-efficient for large datasets.
Scenario: Server-side CSV export initiated
When the Operator clicks Export CSV
Then POST /audit/export is issued with { "format": "csv" }; the server streams a pre-redacted CSV where sensitive values are masked server-side before transmission; the browser initiates a file download when the first chunk arrives
And the response sets Content-Disposition: attachment; filename="audit-export-<date>.csv" and Cache-Control: no-store; the client never sees raw audit bytes in the response
Scenario: Export with filters applied
Given the Operator has applied a type=project_archived filter on the audit feed
When the Operator clicks Export CSV
Then POST /audit/export is issued with { "format": "csv", "filters": { "type": "project_archived" } }; only filtered events appear in the export
Endpoint / DB | Purpose
POST /audit/export | Stream pre-redacted export; body: { "format": "csv"|"json", "filters": {...} }
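The server-side path can be sketched as a generator that redacts each event and yields one CSV line at a time, so the full export is never materialised in memory and raw values never enter the response stream. The AuditEvent shape, the column set, and redactRow are illustrative assumptions; a real handler would also set the Content-Disposition and Cache-Control: no-store headers described in the scenario.

```typescript
// Illustrative row shape; the production schema is richer.
type AuditEvent = { created_at: string; type: string; actor: string; token?: string };

function redactRow(e: AuditEvent): AuditEvent {
  // Server-side masking happens before any bytes are written to the stream.
  return { ...e, token: e.token !== undefined ? "[REDACTED]" : undefined };
}

function* streamCsv(events: Iterable<AuditEvent>): Generator<string> {
  yield "created_at,type,actor,token\n"; // header chunk triggers the download
  for (const e of events) {
    const r = redactRow(e);
    yield `${r.created_at},${r.type},${r.actor},${r.token ?? ""}\n`;
  }
}
```

Because each yielded string can be flushed immediately, memory use is bounded by one row regardless of export size, which is the property the story's "memory-efficient for large datasets" clause requires.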

Story 15.3 — Client-Side Web Worker Export (Fallback Path)

As an OP, I want the audit export to use a Web Worker when the deployment routes export through the browser, so that the main thread is never blocked during large audit blob processing.
Scenario: Export processed in audit-export-worker.ts — main thread stays responsive
Given the deployment uses the client-side export path and the Operator initiates an export of 100 000 rows
When the audit-export-worker.ts worker processes the blob
Then the worker fetches audit chunks via ReadableStream; passes each chunk through the RedactionFilter (Schema-Strict mode) before emitting via postMessage; the UI shows a determinate progress indicator (Processed N / M rows); a regression test asserts requestAnimationFrame cadence on the main thread stays ≥ 50 fps throughout
Scenario: Worker torn down after export completes
Given the final chunk has been processed and the download has been triggered
Then URL.revokeObjectURL is called immediately after the download initiates; the worker's MessagePort is closed; the worker scope is released (garbage-collectible)
Endpoint / DB | Purpose
GET /audit/export/stream | Client-side export streaming endpoint; sets Cache-Control: no-store
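The worker's inner loop can be sketched as chunk-at-a-time processing with a progress callback, which is what makes the determinate "Processed N / M rows" indicator possible. This sketch assumes chunks arrive as arrays of already-parsed rows; in the real audit-export-worker.ts they would come from a ReadableStream reader and leave via postMessage. redactChunk and onProgress are illustrative names, not the worker's actual message protocol.

```typescript
type Row = Record<string, unknown>;

function processChunks(
  chunks: Row[][],
  totalRows: number,
  redactChunk: (rows: Row[]) => Row[],          // Schema-Strict pass per chunk
  onProgress: (processed: number, total: number) => void,
): Row[] {
  const out: Row[] = [];
  let processed = 0;
  for (const chunk of chunks) {
    out.push(...redactChunk(chunk));
    processed += chunk.length;
    onProgress(processed, totalRows);           // drives "Processed N / M rows"
  }
  return out;
}
```

Keeping redaction inside the worker means the main thread only ever handles progress messages and the final redacted blob, which is why the ≥ 50 fps regression assertion is achievable even at 100 000 rows.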

Story 15.4 — Non-Persistence Guarantee

As an OP, I want proof that the audit export pipeline writes no bytes to any browser storage mechanism, so that I can demonstrate to auditors that exported data does not persist in the browser after download.
Regression: No localStorage / sessionStorage writes during export
Given an audit export of 100 000 rows has been initiated and completed
Then a regression test inspects localStorage and sessionStorage and asserts zero bytes attributable to the audit blob were written at any point during the export
Regression: No IndexedDB writes during export
Then zero audit-blob-related records are found in any IndexedDB store after the export completes (verified via DevTools Storage API)
Regression: Cache-Control: no-store honoured — HTTP cache does not retain export
Given the export endpoint sets Cache-Control: no-store
Then the browser HTTP cache does not retain the response; re-requesting the same URL confirms a fresh fetch occurs (not a cache hit)
Regression: Blob URL revoked immediately after download initiates
Then URL.revokeObjectURL is called in the same tick as the click event on the download anchor; no reference to the Blob URL remains in component state after revocation

Story 15.5 — Auditor Redacted Feed and Copy Interception

As an AU, I want the audit feed to be pre-redacted before I can read it, and for copied text to also be redacted, so that I can perform compliance review without being exposed to raw secrets or PII.
Scenario: Audit feed rendered through Schema-Strict Redaction Filter for Auditor
Given the user has the Auditor role and an audit event payload contains a value at known-sensitive path $.payload.private_key_b64
When the feed row renders
Then the RedactionFilter (Schema-Strict mode, surface: epic-15-feed) processes the payload before DOM serialisation; the value at $.payload.private_key_b64 is replaced with [REDACTED]; non-sensitive fields (timestamps, actor names, event types) are rendered unmodified; a redaction.applied breadcrumb is emitted
Regression: Auditor copy interception — clipboard contains only redacted text
Given the Auditor selects a row in the audit feed that contains a [REDACTED] placeholder
When the Auditor presses Ctrl/Cmd-C
Then the copy event is intercepted with event.preventDefault(); redactSelection(selectedText, 'schema_strict', 'epic-15-feed') is called; only the redacted projection is written to event.clipboardData; a regression test asserts the clipboard contains [REDACTED], never the raw sentinel value
Regression: Known-bad payload renders as [REDACTED] end-to-end
Given a test audit event is injected containing a fake API-key sentinel at $.actor.token
When the event is rendered in the feed and in an export initiated by an Auditor
Then both the feed DOM and the export file contain [REDACTED] at the sentinel position; the raw sentinel value is never present in either surface (regression assertion)
Scenario: Auditor export limited to read-only redacted CSV only
Given the user has the Auditor role
When the Auditor accesses the export affordance
Then only Export CSV (redacted) is offered; the export is redacted via Schema-Strict mode before delivery; no Export JSON (richer format) is offered to Auditors
Endpoint / DB | Purpose
GET /audit/events | Auditor receives same data; Schema-Strict redaction applied before DOM serialisation
POST /audit/export | Auditor export (CSV only, pre-redacted by Schema-Strict filter)
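The copy-interception flow above can be sketched as a handler that suppresses the native copy and writes only the redacted projection to the clipboard. This uses a simplified one-argument form of the redactSelection(selectedText, mode, surface) call from the scenario; the secret pattern and the ClipboardEventLike shape are illustrative assumptions standing in for the Schema-Strict pattern table and the DOM ClipboardEvent.

```typescript
// Illustrative secret pattern; the real filter is driven by the Schema-Strict table.
const SECRET_PATTERN = /\b(sk|ghp|key)[-_][A-Za-z0-9_-]{8,}\b/g;

function redactSelection(text: string): string {
  return text.replace(SECRET_PATTERN, "[REDACTED]");
}

// Minimal shape mimicking the DOM ClipboardEvent for this sketch.
interface ClipboardEventLike {
  preventDefault(): void;
  clipboardData: { setData(type: string, data: string): void };
}

function onCopy(e: ClipboardEventLike, selectedText: string): void {
  e.preventDefault(); // suppress the native copy of the raw selection
  e.clipboardData.setData("text/plain", redactSelection(selectedText));
}
```

Calling preventDefault before setData is what guarantees the raw selection never reaches the clipboard, which is the property the regression test asserts.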