Custom Compliance Frameworks
Not all compliance requirements fit standard frameworks. Custom Compliance Frameworks let you build tailored frameworks for your organization’s unique regulatory needs, internal policies, or industry-specific requirements.
Overview
Custom frameworks allow you to:
- Define custom controls — Create control objectives and requirements specific to your organization
- Map evidence sources — Link controls to actual evidence (Jira tickets, logs, documentation, manual uploads)
- Run gap analysis — Automatically identify which controls have sufficient evidence and which need attention
- Cross-map to standards — Align custom controls to standard frameworks (SOC 2, ISO 42001, etc.) for regulatory credibility
Creating a Custom Framework
Step 1: Framework Basics
- Go to Compliance → Frameworks → Custom Frameworks
- Click Create Custom Framework
- Enter framework details:
- Name — Descriptive name (e.g., “Internal AI Safety Requirements”, “Healthcare Data Handling Policy”)
- Description — What this framework covers (optional but recommended)
- Effective Date — When this framework applies
- Owner — Who’s responsible for maintaining this framework
- Version — Track framework iterations (default: 1.0)
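If you prefer to script framework setup, the same details can be captured in a simple data structure before submission. This is a hypothetical sketch — the field names mirror the form above, not a documented TruthVouch API:

```python
# Hypothetical framework definition; field names mirror the form fields
# above and are illustrative, not TruthVouch's actual API schema.
framework = {
    "name": "Internal AI Safety Requirements",
    "description": "Safety controls for in-house AI systems",
    "effective_date": "2024-01-15",
    "owner": "compliance-team@example.com",
    "version": "1.0",  # default starting version
}

# Minimal sanity check before submitting the definition.
required = {"name", "effective_date", "owner", "version"}
missing = required - framework.keys()
assert not missing, f"Missing required fields: {missing}"
```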
Step 2: Define Control Categories
Organize controls into logical groups (optional but recommended for large frameworks):
Healthcare AI Safety Framework
├─ Data Protection
│  ├─ Patient Data Classification
│  ├─ Encryption at Rest
│  └─ Access Logging
├─ Model Governance
│  ├─ Model Training Approval
│  ├─ Model Validation
│  └─ Model Monitoring
└─ Compliance & Audit
   ├─ Documentation
   ├─ Incident Response
   └─ Audit Trail

Click Add Category for each group.
Step 3: Define Controls
For each category, define individual controls:
- Click Add Control in a category
- Enter control details:
- Control ID — Unique identifier (e.g., “HDAI-1.1”, “PHI-2.3”)
- Title — Concise control name
- Description — What this control requires
- Objective — Business purpose or regulation citation
- Assessment Method — How this control is verified (e.g., “Automated logs”, “Manual review”, “Interview”)
Example control:

Control ID: HDAI-1.1
Title: Patient Data Classification
Description: All data sources containing patient information must be classified as such and documented.
Objective: Ensure patient data is handled with appropriate care per HIPAA requirements
Assessment Method: Manual review of data inventory + source code scanning for patient data patterns

Step 4: Evidence Mapping
For each control, specify what evidence proves compliance:
- Click Map Evidence on a control
- Select evidence sources:
- Jira Tickets — Specific issues proving implementation
- Code Commits — Git commits implementing the control
- Configuration Files — System settings proving compliance
- Logs — Audit/access logs from systems
- Documentation — Policies, runbooks, design docs
- Manual Uploads — Evidence files (PDFs, screenshots, etc.)
- External Systems — Okta logs, AWS CloudTrail, etc.
- For each evidence source, define:
- Query/Filter — How to find relevant evidence (e.g., “Jira issues with label=‘hipaa-compliance’”)
- Validation Rules — What makes evidence “sufficient” (e.g., “At least 1 closed issue required”)
- Freshness — How recent evidence must be (e.g., “Evidence created in past 90 days”)
Example mapping:
Control: HDAI-1.1 (Patient Data Classification)
Evidence Source 1: Jira Tickets
├─ Query: status=Done AND label=patient-data-classification
├─ Validation: At least 1 closed issue
└─ Freshness: Completed in past 180 days
Evidence Source 2: Code Scanning
├─ Query: Committed files matching pattern: **/patient-data*.py
├─ Validation: Contains 'PATIENT_DATA' classification markers
└─ Freshness: Any commit in past year
Evidence Source 3: Manual Upload
├─ Description: Data Classification Policy document
├─ Validation: PDF uploaded and signed by Data Officer
└─ Freshness: Renewed annually

Running Gap Analysis
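Validation and freshness rules like these amount to simple predicates over each evidence item. A minimal sketch of how the Jira rule above might be evaluated (the issue list and rule thresholds are illustrative, not TruthVouch's implementation):

```python
from datetime import date, timedelta

def is_fresh(completed: date, max_age_days: int, today: date) -> bool:
    """True if the evidence item falls inside the freshness window."""
    return today - completed <= timedelta(days=max_age_days)

def jira_rule_satisfied(issues: list[dict], today: date) -> bool:
    """Mirrors Evidence Source 1: at least 1 closed issue
    completed in the past 180 days."""
    fresh_closed = [
        i for i in issues
        if i["status"] == "Done" and is_fresh(i["completed"], 180, today)
    ]
    return len(fresh_closed) >= 1

today = date(2024, 4, 1)
issues = [
    {"status": "Done", "completed": date(2024, 1, 20)},  # fresh, counts
    {"status": "Done", "completed": date(2023, 1, 5)},   # stale, ignored
]
print(jira_rule_satisfied(issues, today))  # True: one fresh closed issue
```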
Gap analysis identifies which controls have sufficient evidence and which need attention:
Automatic Gap Detection
- Go to Custom Frameworks → [Framework Name]
- Click Run Gap Analysis
- TruthVouch automatically:
- Queries all mapped evidence sources
- Validates evidence against assessment method requirements
- Assigns each control a status
Gap Analysis Results
Controls are classified as:
Control Status Overview (Healthcare AI Safety)
✓ Satisfied (18 controls)
  └─ Sufficient evidence found; control is compliant

⚠ Partial (7 controls)
  └─ Some evidence found, but not meeting validation criteria
  └─ Example: Only 1 recent test result; policy requires 2+

✗ Not Satisfied (5 controls)
  └─ Little to no evidence found; action required

? Not Assessed (2 controls)
  └─ Assessment method requires manual review; schedule manually

Remediation View
For each gap, see recommended actions:
Control: HDAI-3.2 (Model Validation Testing)
Status: Not Satisfied
Issue: No evidence of model validation tests in past 90 days
Recommendation: Run validation test suite for deployed models

Suggested Actions:
1. Execute model_validation.py for all models in production
2. Upload test results to shared evidence repository
3. Assign Jira task to ML team: "Run HDAI-3.2 validation tests"

Estimated Effort: 4 hours
Due Date: 2024-04-15 (90 days from framework effective date)

Mapping to Standard Frameworks
Align your custom controls to standard frameworks for regulatory credibility:
- Click Cross-Map Controls
- For each custom control, map to equivalent standard control:
Custom Control                          Standard Framework Control
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
HDAI-1.1 (Patient Data                  → ISO 42001 A.5.1 (Data Management)
Classification)                         → SOC 2 CC6.1 (Logical Access)
                                        → HIPAA §164.312(a)(2)(i)

Benefits of cross-mapping:
- Regulatory alignment — Prove your custom framework covers standard requirements
- Efficiency — One control satisfies multiple frameworks
- Audit credibility — Auditors see standard framework mapping
- Compliance reporting — Generate reports showing how custom framework aligns with regulations
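A cross-map is essentially a many-to-many relation from custom controls to standard ones. As a sketch (the entries echo the HDAI-1.1 example above; the data structure is illustrative, not TruthVouch's storage format):

```python
# Illustrative cross-map: custom control ID -> standard framework controls.
# Entries echo the HDAI-1.1 example; this is a sketch, not a real schema.
cross_map = {
    "HDAI-1.1": [
        ("ISO 42001", "A.5.1"),
        ("SOC 2", "CC6.1"),
        ("HIPAA", "\u00a7164.312(a)(2)(i)"),
    ],
}

def coverage(framework: str) -> list[str]:
    """Custom controls that claim coverage of the given standard framework."""
    return [
        custom for custom, targets in cross_map.items()
        if any(std == framework for std, _ in targets)
    ]

print(coverage("SOC 2"))  # ['HDAI-1.1']
```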
Using Custom Frameworks in Compliance Scans
Once defined, custom frameworks can be used in compliance scans:
- Create a Compliance Scan → Select frameworks
- Include custom frameworks alongside standard ones
- Run scan like any other framework:

Selected Frameworks:
├─ SOC 2 Type II (50 controls)
├─ ISO 42001 (45 controls)
└─ Healthcare AI Safety (30 custom controls) ← NEW

- Scan results include custom controls with the same reporting as standard frameworks:
- Control-by-control status
- Evidence inventory
- Gap analysis with remediation tasks
- Audit-ready reports
Version Management
Track framework changes over time:
Create New Version
When framework requirements change:
- Go to Custom Framework → [Framework] → Versions
- Click Create New Version
- Make changes (add controls, update requirements, etc.)
- Set Effective Date (when old version expires)
- Archive old version or maintain for historical compliance
Version Comparison
Compare two versions to see what changed:
Comparison: v1.0 → v2.0
Added Controls (3):
+ HDAI-4.1 Model Explainability
+ HDAI-4.2 Bias Detection
+ HDAI-4.3 Model Drift Monitoring

Modified Controls (2):
≈ HDAI-1.1 Patient Data Classification
  └─ Updated Evidence Freshness from 180 → 90 days
≈ HDAI-3.2 Model Validation Testing
  └─ Added requirement for quarterly (not annual) testing

Removed Controls (1):
- HDAI-5.1 Legacy Data Retention (superseded by HDAI-1.3)

Compliance Automation
Automatically track custom framework compliance over time:
Scheduled Gap Analysis
Set up automatic scans:
- Custom Framework → [Framework] → Settings
- Under “Automated Compliance”, enable:
- Frequency — Weekly, monthly, quarterly, or on-demand
- Alerts — Notify when new gaps detected
- Auto-Remediation Tasks — Create Jira/ServiceNow tickets for gaps
Example automation:

Frequency: Monthly (1st of month, 2 AM)
Evidence Freshness: 60 days
Alert Threshold: Any new gaps or stale evidence
Auto-Create Tasks: Yes
  ├─ Assignee: Framework owner
  ├─ Priority: High
  ├─ Due Date: 30 days from gap detection
  └─ Description: Auto-generated from gap analysis

Evidence Collection Automation
Automatically pull evidence from connected systems:
- Continuous ingestion from Jira, GitHub, Okta, etc.
- Auto-validation against control requirements
- Real-time status updates as evidence arrives
- No manual collection needed
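At its core, continuous ingestion is a loop: receive an event from a connected system, match it against the evidence queries of each control, and attach it so the next validation pass sees it. A rough sketch under those assumptions (the payload shape and control registry are hypothetical):

```python
# Hypothetical sketch of event-driven evidence ingestion. The webhook
# payload shape and the control registry are illustrative only.
controls = {
    "HDAI-1.1": {"label": "patient-data-classification", "evidence": []},
}

def ingest(event: dict) -> list[str]:
    """Attach a Jira-style event to every control whose label query
    matches; return the IDs of controls that received new evidence."""
    updated = []
    for cid, ctl in controls.items():
        if ctl["label"] in event.get("labels", []):
            ctl["evidence"].append(event)
            updated.append(cid)
    return updated

event = {"key": "COMP-42", "status": "Done",
         "labels": ["patient-data-classification"]}
print(ingest(event))  # ['HDAI-1.1']
```

In a real deployment this would sit behind a webhook endpoint; the point is that once queries are defined per control, no manual routing of evidence is required.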
Custom Framework Best Practices
Design
- Be specific — Define exactly what “satisfies” each control (validation rules)
- Map to evidence — Each control should have realistic evidence sources
- Keep it maintainable — 30-50 controls is typical; avoid 100+ control frameworks
- Version iteratively — Start simple, add complexity based on audit findings
Implementation
- Start small — Pilot with 1-2 critical frameworks before rolling out org-wide
- Train teams — Ensure control owners understand requirements and evidence
- Automate evidence — Use APIs and webhooks; don’t rely on manual uploads
- Review regularly — Quarterly reviews of framework relevance and effectiveness
Compliance
- Audit evidence — Ensure all evidence sources are auditable (with timestamps, ownership)
- Document decisions — Explain why each control maps to standard framework controls
- Archive versions — Keep historical versions for regulatory proof (auditors want to know what applied when)
- Cross-map carefully — Don’t overclaim; only map controls that are genuinely equivalent
Example Custom Frameworks
Scenario 1: Financial Services (In-House AI)
Framework: Internal AI Risk Management (IARM)
Controls (25):
├─ Model Governance (8)
├─ Data Quality & Integrity (6)
├─ Explainability & Fairness (5)
├─ Testing & Validation (4)
└─ Monitoring & Incident Response (2)

Cross-mapped to: NIST AI RMF, SOC 2, FINRA
Evidence from: GitHub (model commits), Jira (testing), Datadog (monitoring), Manual (fairness analysis)
Scan frequency: Quarterly

Scenario 2: Healthcare (EU + US)
Framework: Healthcare AI Compliance (HAC)
Controls (35):
├─ Patient Data Protection (8)
├─ Model Explainability (7)
├─ Bias & Fairness (6)
├─ Testing & Validation (5)
├─ Monitoring & Drift (5)
└─ Incident Response (4)

Cross-mapped to: HIPAA, GDPR, ISO 42001, FDA Guidance
Evidence from: Okta (access logs), AWS CloudTrail (data handling), GitHub (code), Manual (fairness audits)
Scan frequency: Monthly