Regulatory Compliance Automation
Your company just passed its SOC 2 Type II audit, but the evidence collection took three engineers two full weeks. Half the work was copying logs from various systems into spreadsheets, writing narrative descriptions of controls, and screenshotting configurations that proved policies were enforced. Next quarter the auditor will ask for the same evidence again, and the engineer who knew where everything lived just left the company.
Compliance is not optional, but the manual labor surrounding it is. AI coding tools can automate policy enforcement at the code level, generate audit trails from your existing infrastructure, and produce compliance documentation that stays current without dedicated headcount.
What You’ll Walk Away With
- Automated compliance-as-code patterns that enforce policies in CI/CD pipelines
- Pre-commit hooks that catch compliance violations before code reaches production
- AI-driven audit trail generation from Git history, infrastructure logs, and deployment records
- Prompts for generating SOC 2, HIPAA, and GDPR compliance evidence packages
- Security scanning automation integrated into your daily development workflow
The Compliance-as-Code Workflow
The core idea: encode your regulatory requirements as executable rules that run automatically. Instead of proving compliance retroactively, you prevent non-compliance from shipping.
1. Define your compliance requirements as machine-readable policies. Translate each control (SOC 2 CC6.1, HIPAA 164.312, GDPR Article 25) into a rule that can be checked programmatically (see the sketch after this list).
2. Implement enforcement at multiple layers. Pre-commit hooks catch issues locally. CI/CD gates block non-compliant code from merging. Infrastructure-as-code scanners validate deployment configurations.
3. Generate audit evidence automatically. Every enforcement check produces a timestamped log entry. These logs become your audit trail.
4. Produce compliance reports on demand. AI aggregates enforcement logs, Git history, and infrastructure state into narrative reports that auditors expect.
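What "machine-readable" means in practice depends on your stack. As a minimal sketch, assuming a small internal rules module (the `PolicyRule` shape, control IDs, and regex are illustrative, not a standard schema), a rule pairs a control reference with a programmatic check:

```typescript
// Illustrative policy definition: each control maps to a check that can run
// in a pre-commit hook, a CI gate, or an infrastructure scan.
interface PolicyRule {
  id: string;          // internal rule identifier
  control: string;     // e.g. "SOC2-CC6.1", "HIPAA-164.312"
  description: string;
  check: (files: string[]) => Promise<{ passed: boolean; violations: string[] }>;
}

export const noHardcodedSecrets: PolicyRule = {
  id: "no-hardcoded-secrets",
  control: "SOC2-CC6.1",
  description: "Secrets must come from environment variables, never source code.",
  check: async (files) => {
    const { readFile } = await import("node:fs/promises");
    // Naive pattern for the sketch; dedicated scanners (gitleaks, detect-secrets) do better.
    const pattern = /(api[_-]?key|password|secret|token)\s*[:=]\s*['"][^'"]+['"]/i;
    const violations: string[] = [];
    for (const file of files) {
      if (pattern.test(await readFile(file, "utf8"))) violations.push(file);
    }
    return { passed: violations.length === 0, violations };
  },
};
```

Because every layer consumes the same rule list, each run can emit the same evidence record, which is what makes step 3 automatic.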
Policy Enforcement in CI/CD
The most reliable compliance enforcement happens in your CI/CD pipeline. Code that violates a policy never reaches production because the pipeline rejects it.
Use Cursor’s agent mode to generate a compliance enforcement pipeline.
@codebase "Create a GitHub Actions workflow that enforces the following compliancepolicies before any code can be merged to main:
1. All secrets must be stored in environment variables, never hardcoded (scan for patterns like API keys, passwords, tokens)2. All database queries must use parameterized statements (no string concatenation in SQL)3. All user-facing endpoints must have authentication middleware4. All PII fields must be encrypted at rest (check schema definitions)5. All changes to auth-related files require two approvals
For each check, log the result to a compliance-evidence.json artifactwith timestamp, check name, result (pass/fail), and affected files.Generate the workflow file and any helper scripts needed."Cursor generates the workflow YAML, scanning scripts, and artifact collection logic. Review the generated rules against your actual regulatory requirements.
claude "Create a compliance enforcement system for our CI/CD pipeline.
Requirements:- GitHub Actions workflow that runs on every PR to main- Secret detection: scan for hardcoded API keys, passwords, and tokens- SQL injection prevention: verify parameterized queries- Authentication checks: ensure all public endpoints use auth middleware- PII protection: validate encryption on sensitive database fields- Approval gates: require 2 reviewers for changes to auth/ and security/ dirs
Each check should produce structured JSON output with: { timestamp, check_name, result, affected_files, evidence_hash }
Store results as workflow artifacts for audit trail.Generate all files needed: workflow YAML, scanning scripts, and acompliance-report-generator that summarizes results."Claude Code creates the files directly in your repository. Test the pipeline on a branch before merging.
```
Create a compliance enforcement system for GitHub Actions CI/CD:

Checks needed:
1. Secret scanning (no hardcoded credentials in source)
2. SQL injection prevention (parameterized queries only)
3. Auth middleware on public endpoints
4. PII encryption validation in database schemas
5. Two-reviewer requirement for auth-related changes

Output structured JSON evidence for each check. Store as artifacts.
Generate the workflow YAML and all scanning scripts.
```

Codex generates the files as a task. Review the output before applying, since compliance rules must be exact.
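All three prompts ask for the same structured evidence record. A minimal TypeScript sketch of that shape, assuming a SHA-256 hash over the record for later verification (field names follow the prompts; the hashing choice is an assumption):

```typescript
import { createHash } from "node:crypto";

// One entry in compliance-evidence.json. The evidence_hash lets an auditor
// confirm the record was not edited after the pipeline produced it.
interface ComplianceEvidence {
  timestamp: string;          // ISO 8601
  check_name: string;         // e.g. "secret-scan"
  result: "pass" | "fail";
  affected_files: string[];
  evidence_hash: string;      // SHA-256 over the other fields
}

export function buildEvidence(
  checkName: string,
  result: "pass" | "fail",
  affectedFiles: string[],
): ComplianceEvidence {
  const record = {
    timestamp: new Date().toISOString(),
    check_name: checkName,
    result,
    affected_files: affectedFiles,
  };
  const evidence_hash = createHash("sha256")
    .update(JSON.stringify(record))
    .digest("hex");
  return { ...record, evidence_hash };
}
```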
Pre-Commit Compliance Hooks
Pre-commit hooks give developers immediate feedback before code ever leaves their machine. This is faster than waiting for CI and reduces compliance violations at the source.
"Generate a pre-commit hook configuration (.pre-commit-config.yaml) thatenforces these compliance rules locally:
1. No secrets in committed files (use detect-secrets or gitleaks)2. No TODO/FIXME comments in files under src/security/3. All TypeScript files must have 'use strict' or strict mode enabled4. Database migration files must include a rollback step5. API route files must import from the auth middleware module
For each rule that fails, print a clear message explaining the violationand how to fix it. Also create a COMPLIANCE.md that documents each hookand the regulatory requirement it addresses."claude "Set up pre-commit hooks for compliance enforcement:
Rules:- Secret detection using gitleaks- No TODO/FIXME in security-critical files- Strict mode enforcement for TypeScript- Rollback requirement in database migrations- Auth middleware import check on API routes
Create .pre-commit-config.yaml and any custom hook scripts.Each hook should print a developer-friendly error message explainingwhat policy was violated and how to resolve it."Set up pre-commit compliance hooks:- gitleaks for secret detection- Custom hook: no TODO/FIXME in src/security/- Custom hook: database migrations must have rollback- Custom hook: API routes must import auth middlewareCreate .pre-commit-config.yaml and custom scripts.Automated Audit Trail Generation
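To make the "developer-friendly error message" requirement concrete, here is a sketch of one custom hook, the migration rollback check, as a small Node script. The paths, keywords, and the assumption that pre-commit passes staged filenames as arguments are illustrative; adapt them to your migration tool.

```typescript
// Run via tsx (or compile to JS) from .pre-commit-config.yaml, which passes
// staged file paths as CLI arguments. Fails the commit when a migration
// file has no rollback/down step.
import { readFileSync } from "node:fs";

const stagedFiles = process.argv.slice(2);
const migrations = stagedFiles.filter((f) => f.includes("migrations/"));

const missingRollback = migrations.filter((file) => {
  const content = readFileSync(file, "utf8").toLowerCase();
  return !content.includes("down") && !content.includes("rollback");
});

if (missingRollback.length > 0) {
  console.error("Compliance check failed: migrations missing a rollback step:");
  for (const file of missingRollback) console.error(`  - ${file}`);
  console.error("Add a down/rollback step so the change can be reversed (change management policy).");
  process.exit(1);
}
```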
Automated Audit Trail Generation

Auditors need evidence that controls were enforced over time, not just at the moment of the audit. An automated audit trail captures enforcement data continuously.
Git-Based Audit Trail
Your Git history already contains a rich audit trail. AI can extract compliance-relevant events from it.
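As a sketch of what that extraction can look like, a short script can turn commits touching security-sensitive paths into evidence records (the paths, log format, and output location are assumptions):

```typescript
import { execSync } from "node:child_process";
import { mkdirSync, writeFileSync } from "node:fs";

// Pull compliance-relevant events from Git history: who changed
// security-sensitive paths, when, and in which commit.
const log = execSync(
  'git log --since="90 days ago" --pretty=format:"%H|%an|%aI|%s" -- src/security/ src/auth/',
  { encoding: "utf8" },
);

const events = log
  .split("\n")
  .filter(Boolean)
  .map((line) => {
    const [commit, author, date, ...subject] = line.split("|");
    return { commit, author, date, subject: subject.join("|") };
  });

mkdirSync("compliance-evidence", { recursive: true });
writeFileSync(
  "compliance-evidence/git-audit-trail.json",
  JSON.stringify(events, null, 2),
);
```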
Infrastructure Audit Trail
Infrastructure changes also need audit trails. If you use Terraform, CloudFormation, or Pulumi, every change is already versioned.
@codebase "Generate an infrastructure compliance report by analyzing:
1. All Terraform state changes in the past quarter2. Security group modifications (who changed what, when)3. IAM policy changes and their justifications from PR descriptions4. Encryption configuration status for all data stores5. Network access control changes
Cross-reference each change against our SOC 2 CC6.1 (logical access)and CC6.3 (network access) requirements. Flag any changes that lacka corresponding approval in the PR process.
Output as a structured report with evidence links to specific commits."claude "Generate an infrastructure compliance audit report.
Analyze Terraform state changes over the past 90 days.For each change, extract:- What resource was modified- Who made the change (Git author)- PR number and approval status- Whether the change affects security controls (security groups, IAM, encryption, network ACLs)
Map each finding to SOC 2 controls CC6.1, CC6.3, CC7.1.Flag any changes without proper PR approval.Output as markdown with executive summary and detailed findings."Generate infrastructure compliance report from Terraform state changes.Cover: security groups, IAM policies, encryption configs, network ACLs.Map findings to SOC 2 controls. Flag unapproved changes.Output as markdown audit report.SOC 2 Evidence Package Generation
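If you want a deterministic starting point before handing the analysis to an AI tool, a small script can classify planned Terraform changes by control; a sketch assuming `terraform plan -out=plan.out` has already run and that the resource-to-control mapping below is merely illustrative:

```typescript
import { execSync } from "node:child_process";

// Map security-relevant resource types to the controls they touch.
// This mapping is an example, not compliance guidance.
const CONTROL_MAP: Record<string, string> = {
  aws_security_group: "SOC2-CC6.3",
  aws_network_acl: "SOC2-CC6.3",
  aws_iam_policy: "SOC2-CC6.1",
  aws_kms_key: "SOC2-CC6.1",
};

// `terraform show -json` renders the saved plan as machine-readable JSON.
const plan = JSON.parse(
  execSync("terraform show -json plan.out", { encoding: "utf8" }),
);

const findings = (plan.resource_changes ?? [])
  .filter((rc: any) => !rc.change.actions.includes("no-op"))
  .map((rc: any) => ({
    address: rc.address,
    actions: rc.change.actions,
    control: CONTROL_MAP[rc.type] ?? "unmapped",
  }));

console.log(JSON.stringify(findings, null, 2));
```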
SOC 2 Evidence Package Generation

SOC 2 Type II audits require evidence that controls operated effectively over a period (usually 6-12 months). Generating this evidence manually is the most time-consuming part of the audit.
Automating Evidence Collection
```
"Generate a complete SOC 2 Type II evidence package for the past quarter.

For each Trust Service Criteria, collect the following:

CC6.1 (Logical Access Controls):
- User access reviews (export from our identity provider)
- Role-based access control documentation
- MFA enforcement evidence (percentage of users with MFA enabled)
- Privileged access inventory

CC6.2 (System Account Management):
- Service account inventory with last rotation dates
- Automated credential rotation evidence from CI/CD logs
- Account lifecycle documentation (creation, modification, termination)

CC7.1 (System Boundaries):
- Network architecture diagram (generate from Terraform)
- Data classification matrix
- Data flow diagrams for PII

CC8.1 (Change Management):
- All PRs merged to main with approval status
- CI/CD pipeline pass rates
- Deployment logs with rollback instances
- Change advisory board meeting notes (if applicable)

For each control, provide:
1. Control description
2. Evidence collected
3. Testing results (effective/deficiency)
4. Recommendations for improvement

Output as a structured document that an auditor can review directly."
```

```
claude "Generate SOC 2 Type II evidence package for Q4.

Controls to cover:
- CC6.1: Logical access controls (user reviews, RBAC, MFA)
- CC6.2: System accounts (service account inventory, rotation)
- CC7.1: System boundaries (network diagrams, data classification)
- CC8.1: Change management (PR approvals, CI/CD logs, deployments)

For each control:
- Describe what evidence is needed
- Pull available evidence from Git history and CI/CD artifacts
- Document any gaps where manual evidence is required
- Rate effectiveness and note deficiencies

Format as auditor-ready document with executive summary."
```

```
Generate SOC 2 Type II evidence package.
Cover CC6.1, CC6.2, CC7.1, CC8.1 controls.
Pull evidence from Git, CI/CD, and infrastructure configs.
Flag gaps requiring manual evidence collection.
Output as structured audit document.
```
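Parts of this package can be pulled mechanically before any AI summarization. For the CC8.1 change-management slice, a sketch using the GitHub CLI to list merged PRs and their review decisions (assumes `gh` is authenticated; the 90-PR limit and output format are illustrative):

```typescript
import { execSync } from "node:child_process";

// List recently merged PRs with their review decision, flagging any that
// merged without an approved review.
const raw = execSync(
  "gh pr list --state merged --base main --limit 90 --json number,title,mergedAt,reviewDecision",
  { encoding: "utf8" },
);

const prs = JSON.parse(raw) as Array<{
  number: number;
  title: string;
  mergedAt: string;
  reviewDecision: string;
}>;

const unapproved = prs.filter((pr) => pr.reviewDecision !== "APPROVED");

console.log(`Merged PRs: ${prs.length}`);
console.log(`Merged without an approved review: ${unapproved.length}`);
for (const pr of unapproved) {
  console.log(`  #${pr.number} ${pr.title} (merged ${pr.mergedAt})`);
}
```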
HIPAA Compliance Patterns

HIPAA compliance requires specific technical safeguards for protected health information (PHI). AI tools can enforce these safeguards in code.
PHI Protection in Code
Section titled “PHI Protection in Code”// Example: AI-generated PHI access logging middlewareimport { auditLogger } from '~/lib/compliance/audit';import { classifyData } from '~/lib/compliance/data-classification';
export async function phiAccessMiddleware(request: Request, next: Function) { const classification = await classifyData(request);
if (classification.containsPHI) { await auditLogger.log({ timestamp: new Date().toISOString(), userId: request.auth.userId, action: request.method, resource: request.url, dataClassification: 'PHI', justification: request.headers.get('X-Access-Justification'), ipAddress: request.headers.get('X-Forwarded-For'), }); }
return next(request);}GDPR Compliance Automation
GDPR Compliance Automation

GDPR requires specific capabilities: data subject access requests (DSARs), right to erasure, consent management, and data processing records.
Automating Data Subject Requests
```
"Generate a DSAR (Data Subject Access Request) handler that:

1. Accepts a user identifier (email or user ID)
2. Searches all data stores for records associated with that user:
   - PostgreSQL (users, orders, payments, support_tickets)
   - Redis cache (session data, preferences)
   - S3 (uploaded documents, profile images)
   - Analytics events (PostHog)
   - Email service (Resend delivery logs)
3. Compiles all data into a structured JSON export
4. Generates a human-readable PDF summary
5. Logs the DSAR fulfillment for our processing records
6. Can also handle right-to-erasure by deleting/anonymizing all records

Include proper error handling for partial failures (e.g., one data store
is unreachable). The response should indicate which sources were
successfully queried and which failed, so we can retry."
```

```
claude "Build a GDPR DSAR handler.

Given a user email or ID, it should:
- Query all data stores (PostgreSQL, Redis, S3, analytics, email logs)
- Compile user data into JSON export
- Generate PDF summary for the data subject
- Log the request for Article 30 processing records
- Support right-to-erasure (delete/anonymize across all stores)
- Handle partial failures gracefully

Create the handler, data store adapters, and the API endpoint.
Include tests for the happy path and partial failure scenarios."
```

```
Build a GDPR DSAR handler that queries PostgreSQL, Redis, S3, and
analytics for all user data. Support data export (JSON + PDF) and
right-to-erasure. Log all requests for Article 30 records.
Handle partial failures with retry capability.
```
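One design that keeps the handler from silently missing a data store is an adapter registry that both export and erasure iterate over; a sketch under that assumption (the interface and function names are illustrative, not the code the prompts would generate):

```typescript
// Every data store registers a DSAR adapter, so export and erasure iterate
// the registry instead of a hard-coded list of stores.
interface DsarAdapter {
  store: string;                                      // e.g. "postgres", "redis", "s3"
  export(userId: string): Promise<Record<string, unknown>>;
  erase(userId: string): Promise<void>;
}

const registry: DsarAdapter[] = [];

export function registerAdapter(adapter: DsarAdapter): void {
  registry.push(adapter);
}

export async function handleDsarExport(userId: string) {
  const results: Record<string, unknown> = {};
  const failures: string[] = [];

  // Query every registered store; record failures so they can be retried
  // rather than silently dropping a source from the export.
  for (const adapter of registry) {
    try {
      results[adapter.store] = await adapter.export(userId);
    } catch {
      failures.push(adapter.store);
    }
  }

  return { userId, exportedAt: new Date().toISOString(), results, failures };
}
```

A registry also makes the gap described under "When This Breaks" checkable: CI can verify that every database connection in the codebase has a registered adapter.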
Security Scanning Automation

Continuous security scanning catches vulnerabilities before they reach production. The key is integrating scans into workflows developers already use, not adding separate security gates they learn to bypass.
Dependency Vulnerability Scanning
Section titled “Dependency Vulnerability Scanning”name: Compliance Security Scanon: pull_request: branches: [main] schedule: - cron: '0 6 * * 1' # Weekly Monday scan
jobs: dependency-scan: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: Run dependency audit run: npm audit --json > audit-results.json - name: Analyze results run: | node scripts/compliance/analyze-audit.js \ --input audit-results.json \ --severity high,critical \ --output compliance-evidence/dependency-scan-$(date +%Y%m%d).json - name: Upload compliance evidence uses: actions/upload-artifact@v4 with: name: compliance-evidence-deps path: compliance-evidence/ retention-days: 365Static Analysis for Compliance
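The workflow calls a `scripts/compliance/analyze-audit.js` helper that you supply. A TypeScript sketch of what it might do, assuming the npm v7+ audit report shape (treat the field names as an assumption and adjust for your npm version):

```typescript
import { readFileSync, mkdirSync, writeFileSync } from "node:fs";
import { dirname } from "node:path";

// Read the CLI flags the workflow step passes (--input, --severity, --output).
const args = process.argv.slice(2);
const getFlag = (name: string): string => args[args.indexOf(name) + 1];

const input = getFlag("--input");
const severities = getFlag("--severity").split(",");
const output = getFlag("--output");

// npm v7+ keeps findings under `vulnerabilities`, keyed by package name.
const report = JSON.parse(readFileSync(input, "utf8"));
const findings = Object.values(report.vulnerabilities ?? {}).filter((v: any) =>
  severities.includes(v.severity),
);

const evidence = {
  timestamp: new Date().toISOString(),
  check_name: "dependency-scan",
  result: findings.length === 0 ? "pass" : "fail",
  affected_packages: findings.map((v: any) => `${v.name} (${v.severity})`),
};

mkdirSync(dirname(output), { recursive: true });
writeFileSync(output, JSON.stringify(evidence, null, 2));

// Fail the job only when findings at the configured severities exist.
process.exit(findings.length === 0 ? 0 : 1);
```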
Static Analysis for Compliance

```
"Create a static analysis configuration that checks for compliance-relevant
patterns in our codebase:

1. Data handling: PII must be logged with redaction, never raw values
2. Authentication: All API endpoints must check auth before processing
3. Encryption: Sensitive config values must use encrypted env vars
4. Logging: Security events must use the structured audit logger
5. Error handling: Error responses must not leak internal details

Use ESLint custom rules where possible. For checks ESLint cannot handle,
create standalone analysis scripts. Each rule should reference the
specific compliance requirement it enforces (e.g., SOC2-CC6.1, HIPAA-164.312)."
```

```
claude "Create compliance-focused static analysis rules.

Rules needed:
- PII redaction in logs (no raw email, SSN, phone in log statements)
- Auth middleware on all API routes
- Encrypted env vars for sensitive config
- Structured audit logging for security events
- No internal details in error responses

Implement as ESLint custom rules where possible, standalone scripts
for the rest. Tag each rule with its compliance requirement ID."
```

```
Create static analysis rules for compliance:
- PII redaction in logs
- Auth checks on API endpoints
- Encrypted config for secrets
- Audit logging for security events
- Safe error responses
Use ESLint custom rules. Tag each with compliance requirement ID.
```
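As an example of the ESLint side, a sketch of the PII-redaction rule, typed against @types/eslint (the logger names, PII field list, and control tags are assumptions):

```typescript
import type { Rule } from "eslint";

// Flags identifiers that look like PII when passed directly to a logger.
const PII_NAMES = /^(email|ssn|phone|dateOfBirth)$/i;

const rule: Rule.RuleModule = {
  meta: {
    type: "problem",
    docs: {
      description:
        "Disallow logging likely-PII identifiers without redaction (SOC2-CC6.1, HIPAA-164.312)",
    },
    messages: {
      piiInLog: "Possible PII '{{name}}' passed to a logger; redact it first.",
    },
  },
  create(context) {
    return {
      CallExpression(node) {
        const callee = node.callee;
        const isLogger =
          callee.type === "MemberExpression" &&
          callee.object.type === "Identifier" &&
          ["console", "logger"].includes(callee.object.name);
        if (!isLogger) return;

        for (const arg of node.arguments) {
          if (arg.type === "Identifier" && PII_NAMES.test(arg.name)) {
            context.report({ node: arg, messageId: "piiInLog", data: { name: arg.name } });
          }
        }
      },
    };
  },
};

export default rule;
```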
Regulatory Reporting Automation

Scheduled Compliance Reports
Automate weekly and quarterly compliance reports so the data is always current.
```typescript
import { getGitActivity } from './sources/git';
import { getCIResults } from './sources/ci';
import { getSecurityScans } from './sources/security';
import { getAccessReviews } from './sources/access';
import { renderReport } from './templates/weekly';

async function generateWeeklyComplianceReport() {
  const period = { start: sevenDaysAgo(), end: now() };

  // Pull raw evidence from each source system for the reporting window.
  const data = {
    gitActivity: await getGitActivity(period),
    ciResults: await getCIResults(period),
    securityScans: await getSecurityScans(period),
    accessReviews: await getAccessReviews(period),
  };

  // Map evidence to controls, surface deficiencies, and render the report.
  const report = renderReport({
    ...data,
    controls: mapToControls(data),
    deficiencies: findDeficiencies(data),
    recommendations: generateRecommendations(data),
  });

  await saveReport(report, `weekly-${period.end.toISOString()}`);
  await notifyComplianceTeam(report.summary);
}
```
When This Breaks

Compliance rules produce false positives. A secret scanner flags a test fixture containing a fake API key. Add an allowlist for known false positives, but review the allowlist quarterly to ensure it has not grown to hide real issues.
Audit trail has gaps. If developers push directly to main bypassing the PR process, the audit trail misses approvals. Enforce branch protection rules at the repository level, not just in CI. GitHub and GitLab both support requiring PR approvals as a repository setting.
DSAR handler misses a data store. When a new service is added, someone forgets to add it to the DSAR handler. Include a “data store registry” that all new services must register with, and add a CI check that verifies every database connection in the codebase has a corresponding DSAR adapter.
Compliance reports reference stale policies. The compliance documentation says you rotate credentials every 90 days, but the actual rotation script runs every 180 days. Automate the policy verification: the report generator should check actual rotation dates against the stated policy and flag discrepancies.
Pre-commit hooks slow down developers. If compliance hooks take more than 5 seconds, developers will skip them with --no-verify. Keep pre-commit checks fast (secret scanning, lint rules) and move slower checks (full dependency audit, infrastructure scanning) to CI.
Regulatory requirements change. When a regulation updates (such as new GDPR guidance or SOC 2 criteria revisions), your encoded policies need to change too. Subscribe to regulatory update feeds and schedule quarterly reviews of your compliance-as-code rules against current requirements.