# SOC 2 alignment checklist (AINL)
This document helps security and GRC teams map AINL deployments to common SOC 2 (Trust Services Criteria) expectations. It is not legal or compliance advice. Final control design, evidence collection, and auditor sign-off are your organization’s responsibility.
## Purpose
- Give security and GRC teams a structured starting point when AINL sits inside the control boundary.
- Tie product behaviors (policy gates, execution tape, compiler validation) to audit-friendly language.
- Complement the technical detail in `docs/validation-deep-dive.md`.
## Hosted runner and support
For SLA-backed execution, priority support, or managed runner options, see COMMERCIAL.md and your order form. Open-source AINL remains the reference compiler and runtime; commercial offerings address operational and support expectations that auditors often ask about for production workloads.
## Control mapping (high level)
| TSC area | Typical control theme | How AINL helps (product / design) | Your environment completes |
|----------|----------------------|-------------------------------------|-----------------------------|
| CC6.1 — logical access | Restrict who can change workflows and what side effects are allowed | Policy gates and capability boundaries on adapters (e.g., email, HTTP, secrets, chain RPC) limit what compiled graphs can do per deploy; graphs are explicit, not prompt-hidden | IdP, RBAC on repos, secrets store, production promote process |
| CC6.6 — encryption / credentials | Protect credentials and sensitive data in transit and at rest | Adapters integrate with your configured stores; no hidden tool calls — effects are declared in the graph | TLS, KMS, vault policies, key rotation |
| CC7.2 — monitoring | Detect and respond to security events and anomalies | JSONL execution tape records step-level execution for incident review; reachability and strict diagnostics reduce “silent wrong branch” failures | SIEM forwarding, alerting, log retention, runbook |
| CC8.1 — change management | Changes are authorized, tested, and traceable | Strict compiler (`ainl check --strict`) and CI gates treat `.ainl` / graph IR as versioned artifacts; diffs are reviewable | Git policy, PR review, deployment pipeline |
| A1 — availability (optional) | Resilience of service commitments | Deterministic runtime behavior and explicit control flow aid predictable operations; hosted / commercial options may add SLAs | Infra redundancy, backups, DR |
## Example: email-escalator workflow and CC6.1 / CC7.2
Scenario: A monitoring workflow (see `examples/monitor_escalation.ainl` and the OpenClaw email monitor) checks a signal (e.g., volume threshold) and escalates through policy-gated channels.
| Criterion | Mapping |
|-----------|---------|
| CC6.1 (logical access) | The graph declares which adapters run and under which gates. Escalation paths are not improvised by an LLM at runtime — access to “send mail / notify” is encoded in the compiled workflow and your deployment policy. Reviewers can answer who can change the graph via your Git and CI process. |
| CC7.2 (monitoring) | Running with JSONL execution tape (`ainl run … --trace-jsonl …`) produces a chronological record of node execution. For investigations, teams can correlate “what the workflow did” with replay-oriented analysis instead of reconstructing intent from chat transcripts. Pair tape retention and access control with your log management controls. |
Evidence ideas (illustrative): exported trace files for test runs, CI output from `ainl check --strict`, change tickets linking PRs to promoted graph versions, and screenshots or queries from your SIEM if you forward structured logs.
## Production tape replay example
The repository includes a self-contained, core-only workflow intended for audit narratives and dry runs: `examples/enterprise/audit-log-demo.ainl`. It models a small monitoring slice with explicit policy thresholds (latency and error-rate stand-ins), explicit branches (`within_policy` vs `policy_violation`), and comments mapping behavior to CC7.2 (tape as monitoring evidence) and CC8.1 (versioned graph + strict checks).
### How to capture and replay evidence
- From the repo root, validate the graph: `uv run ainl check examples/enterprise/audit-log-demo.ainl --strict`
- Run once and write a JSONL execution tape: `uv run ainl run examples/enterprise/audit-log-demo.ainl --trace-jsonl audit_demo.tape.jsonl`
- For auditor review, retain the `.ainl` revision (commit SHA), the strict check output, and the tape file(s) together. Re-running the same graph with the same inputs reproduces the branch outcome; the tape lists node-level steps in order.
This does not replace your SIEM, change tickets, or control testing — it gives a concrete, inspectable artifact that ties product behavior (policy gates + tape) to common SOC 2 discussion points.
### Example: Bundling tape for auditor review
Below is a realistic excerpt of JSONL lines emitted by `ainl run … --trace-jsonl` for `audit-log-demo.ainl` (fields and shape match the runtime; timestamps are illustrative).
```jsonl
{"step_id": 3, "label": "_tick", "operation": "Set", "inputs": {"node_id": "n4", "step": {"op": "Set", "lineno": 37, "name": "max_error_bp", "ref": "20"}}, "output": 20, "outcome": "success", "timestamp": "2026-03-30T18:46:28.739Z", "user_reward": null}
{"step_id": 4, "label": "_tick", "operation": "X", "inputs": {"node_id": "n5", "step": {"op": "X", "lineno": 38, "dst": "lat_bad", "fn": "core.gt", "args": ["latency_ms", "max_latency_ms"]}}, "output": true, "outcome": "success", "timestamp": "2026-03-30T18:46:28.739Z", "user_reward": null}
{"step_id": 5, "label": "_violation", "operation": "Set", "inputs": {"node_id": "n1", "step": {"op": "Set", "lineno": 46, "name": "out", "ref": "audit:policy_violation"}}, "output": "audit:policy_violation", "outcome": "success", "timestamp": "2026-03-30T18:46:28.739Z", "user_reward": null}
{"step_id": 6, "label": "_violation", "operation": "J", "inputs": {"node_id": "n2", "step": {"op": "J", "lineno": 47, "var": "out"}}, "output": "audit:policy_violation", "outcome": "success", "timestamp": "2026-03-30T18:46:28.740Z", "user_reward": null}
{"step_id": 7, "label": "_tick", "operation": "If", "inputs": {"node_id": "n6", "step": {"op": "If", "lineno": 39, "cond": "lat_bad", "then": "_violation", "else": "_err_check"}}, "output": "audit:policy_violation", "outcome": "success", "timestamp": "2026-03-30T18:46:28.740Z", "user_reward": null}
```
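As a sketch of how such a tape can be turned into a reviewable digest, the helper below counts steps per label and outcome and surfaces the final output. Field names (`label`, `outcome`, `output`) follow the excerpt above; treat that schema as an assumption rather than a stable guarantee.

```python
# Sketch: summarize a JSONL execution tape into an audit-friendly digest.
# Assumes the field names shown in the excerpt above (label, outcome, output).
import json
from collections import Counter


def summarize_tape(lines):
    """Count steps per label and outcome; surface the final step's output."""
    steps = [json.loads(line) for line in lines if line.strip()]
    return {
        "steps": len(steps),
        "outcomes": dict(Counter(s["outcome"] for s in steps)),
        "labels": dict(Counter(s["label"] for s in steps)),
        "final_output": steps[-1]["output"] if steps else None,
    }
```

A digest like this is easier to attach to a change ticket than a raw tape, while the tape itself remains the primary evidence.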
- Immutable, diffable evidence: each line is one execution step with `label`, `operation`, `outcome`, and a UTC `timestamp`. You can store the file in git-secured storage or an evidence vault, then diff two tapes from different dates or graph versions to show what changed in behavior — complementing (not replacing) your change-management records.
- Export: write a tape with `uv run ainl run examples/enterprise/audit-log-demo.ainl --trace-jsonl audit-tape.jsonl` (`--trace-jsonl` takes the output file path). Archive that file next to the exact `.ainl` (or IR) revision and the `ainl check --strict` output you used for the run.
- Control mapping (illustrative): CC7.2 — the tape is structured audit logging of what executed (policy gate `lat_bad`, branch to `_violation`, join result). CC8.1 — the graph is a versioned artifact; pairing the tape with `ainl check --strict` output from the same commit SHA shows authorized, testable change to production logic.
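A minimal sketch of the tape-diffing idea, assuming two tapes captured from different graph versions. The comparison key `(label, operation, output)` is an illustrative choice for behavioral comparison, not an official tape-diff tool; it deliberately ignores timestamps so only behavior differences surface.

```python
# Sketch: compare two execution tapes by their (label, operation, output)
# sequence to show where behavior diverged between graph versions.
import json


def tape_signature(lines):
    """Reduce a tape to the ordered facts a behavioral diff cares about."""
    return [
        (s["label"], s["operation"], s["output"])
        for s in (json.loads(line) for line in lines if line.strip())
    ]


def tapes_diverge(lines_a, lines_b):
    """Return the first step index where two tapes disagree, or None."""
    sig_a, sig_b = tape_signature(lines_a), tape_signature(lines_b)
    for i, (a, b) in enumerate(zip(sig_a, sig_b)):
        if a != b:
            return i
    if len(sig_a) != len(sig_b):
        return min(len(sig_a), len(sig_b))
    return None
```

Pointing an auditor at the first divergent step, alongside the PR that changed the graph, is a compact way to connect a behavior change to its authorization record.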
## Next steps for your audit pack
- Map each row above to your actual IdP, secrets, SIEM, and change tickets.
- Reference `docs/validation-deep-dive.md` for reachability, strict diagnostics, and tape semantics.
- Replace marketing language with your control narratives before external “certification” claims.
- Involve counsel or a SOC 2 advisor before customer-facing compliance statements.
## Related links
- Validation deep dive
- COMMERCIAL.md — hosted runner and support
- Community spotlights — real workflows (e.g., monitoring, cost reports)
- Production tape replay example — `examples/enterprise/audit-log-demo.ainl`
