HIPAA Compliance for AI in Healthcare
HIPAA (Health Insurance Portability and Accountability Act) requires healthcare organizations and their vendors to protect patient health information (PHI). If your AI system processes, stores, or transmits PHI — including electronic health records (EHR), lab results, insurance claims, or genomic data — HIPAA applies. Compliance AI automates HIPAA compliance for healthcare AI: risk analysis, safeguard implementation, Business Associate Agreement (BAA) management, and breach notification.
What Is HIPAA?
HIPAA is the US federal law governing the privacy and security of patient health information. It applies to:
- Covered entities: Hospitals, clinics, pharmacies, health plans, healthcare clearinghouses
- Business associates: Vendors processing PHI on behalf of covered entities (cloud providers, analytics platforms, AI vendors)
Civil penalties range from $100 to $50,000 per violation, with an annual cap of $1.5M per violation category.
HIPAA Rules for AI Systems
Privacy Rule (45 CFR Part 160 and Part 164, Subparts A & E)
Restricts how PHI can be used and disclosed.
Key requirements:
- Use/disclose PHI only for treatment, payment, operations, or with authorization
- Provide privacy notices to patients
- Honor patient requests to access, amend, or restrict use
- Document PHI inventory and uses
- Limit PHI access to minimum necessary
For AI systems:
- Document why AI needs access to PHI
- Limit training data to what’s necessary
- Don’t sell PHI without written authorization
- De-identify data before using for research/analytics
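The de-identification point above can be sketched in code. This is a minimal, illustrative redactor covering only a handful of identifier types; the HIPAA Safe Harbor method requires removing all 18 identifier categories, and the `MRN` pattern here is an assumed record-number format, not a standard.

```python
import re

# Minimal sketch: redact a few Safe Harbor identifier types via regex.
# A production de-identification pipeline covers all 18 categories and
# is validated (Safe Harbor or Expert Determination).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b"),  # assumed medical record number format
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def deidentify(text: str) -> str:
    """Replace matched identifiers with bracketed placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Patient MRN: 44821, DOB 03/15/1962, call 555-867-5309."
print(deidentify(note))  # Patient [MRN], DOB [DATE], call [PHONE].
```

Regex-based redaction is a starting point only; free-text clinical notes usually need NLP-based identifier detection on top of pattern matching.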
Security Rule (45 CFR Part 160 and Part 164, Subparts A & C)
Requires administrative, physical, and technical safeguards for PHI.
Administrative safeguards:
- Information security officer designated
- Security training for staff
- Incident response procedures
- Authorization and supervision policies
Physical safeguards:
- Restricted access to facilities/servers storing PHI
- Workstation security policies
- Audit logs for access to PHI
Technical safeguards:
- Access control (unique IDs, emergency access procedures)
- Encryption (data in transit and at rest)
- Audit controls (logs of who accessed what)
- Integrity controls (detect unauthorized modification)
For AI systems:
- Encrypt model training data
- Audit model inference API access
- Implement role-based access control
- Monitor unusual access patterns
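The access-control and audit bullets above can be combined in one sketch: every inference-API call passes through a role check and leaves an audit trail entry. The role map and user names are hypothetical; real deployments pull permissions from an identity provider.

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("phi_audit")

# Hypothetical role-to-permission map (a real system sources this from IAM).
ROLE_PERMISSIONS = {
    "clinician": {"predict", "view_record"},
    "billing": {"view_claims"},
}

def authorize_and_audit(user_id: str, role: str, action: str) -> bool:
    """Check role-based access and record who did what, when, and whether
    it was allowed -- the who/what/when trail the Security Rule expects."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    audit_log.info(
        "%s user=%s role=%s action=%s allowed=%s",
        datetime.now(timezone.utc).isoformat(), user_id, role, action, allowed,
    )
    return allowed

print(authorize_and_audit("dr_lee", "clinician", "predict"))  # True
print(authorize_and_audit("dr_lee", "billing", "predict"))    # False
```

Note that denied attempts are logged too; failed-access records are exactly what a monthly anomaly review looks for.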
Breach Notification Rule (45 CFR Part 164 Subpart D)
If PHI is breached (unauthorized acquisition, access, use, or disclosure):
- Assess — Conduct a risk assessment: is there more than a low probability that the PHI was compromised?
- Notify individuals — Within 60 days of discovery (with specific content requirements)
- Notify media — If the breach affects 500+ residents of a state or jurisdiction
- Notify HHS (Office for Civil Rights) — Within 60 days for breaches affecting 500+ individuals; smaller breaches may be logged and reported annually
Since the 2013 Omnibus Rule, an impermissible use or disclosure of PHI is presumed to be a breach unless a documented risk assessment demonstrates a low probability that the PHI was compromised.
For AI systems:
- Monitor for unauthorized access
- If model outputs expose PHI → likely a breach
- If training data containing PHI is leaked → likely a breach
- Implement safeguards to prevent/detect breaches
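The notification deadlines above can be sketched as a small planner, assuming the 60-day clock starts at discovery and the 500-individual threshold drives media and immediate HHS reporting. The function name and return shape are illustrative, not a regulatory tool.

```python
from datetime import date, timedelta

LARGE_BREACH_THRESHOLD = 500  # individuals

def notification_plan(discovered: date, affected: int) -> dict:
    """Sketch of Breach Notification Rule deadlines: individuals within 60
    days of discovery; media if 500+ residents of a jurisdiction; HHS within
    60 days for large breaches, otherwise via the annual log."""
    large = affected >= LARGE_BREACH_THRESHOLD
    return {
        "notify_individuals_by": (discovered + timedelta(days=60)).isoformat(),
        "notify_media": large,
        "notify_hhs": "within 60 days" if large
                      else "annual log (within 60 days of calendar year end)",
    }

plan = notification_plan(date(2025, 3, 1), affected=1200)
print(plan["notify_individuals_by"])  # 2025-04-30
print(plan["notify_media"])           # True
```

A real response plan also tracks the discovery date itself carefully, since "discovery" includes when the breach should reasonably have been known.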
Business Associate Agreements (BAA)
If you’re a vendor providing AI services to healthcare organizations, you need a Business Associate Agreement (BAA).
What Is a BAA?
A legal contract between a covered entity and vendor that:
- Defines what PHI the vendor accesses
- Restricts how vendor can use/disclose PHI
- Requires vendor to implement HIPAA Security Rule safeguards
- Requires vendor to permit audits
- Requires vendor to notify covered entity of breaches
- Requires vendor to return/destroy PHI when relationship ends
BAA Requirements
Your BAA must include:
| Requirement | What Vendor Must Do |
|---|---|
| Permitted uses | Only use PHI as specified by covered entity (treatment, payment, operations) |
| Subcontractors | If you use subcontractors (cloud providers), they must also have BAAs |
| Safeguards | Implement HIPAA Security Rule safeguards equivalent to covered entity |
| Breach notification | Notify covered entity of breaches within 60 days |
| Audit & inspection | Allow covered entity and HHS to audit records |
| Data return/destruction | Return or destroy PHI when relationship ends |
| Authorization for disclosure | Cannot disclose PHI without covered entity authorization |
| Minimum necessary | Disclose only minimum PHI needed for specified purpose |
| Individual rights | Assist covered entity in honoring patient rights (access, amend, restrict) |
Compliance AI support:
- BAA template (customizable)
- HIPAA compliance checklist for vendors
- Security safeguards evidence collection
HIPAA Compliance Roadmap for Healthcare AI
Step 1: Risk Analysis (2-4 weeks)
Assess security risks to PHI in your AI system:
1. Identify assets:
   - What PHI does the system access? (patient names, IDs, medical records, lab results, genomics)
   - Where is PHI stored? (database, files, logs)
   - How is PHI transmitted? (API, batch exports, sync services)
2. Identify threats:
   - Unauthorized access (hacking, insider threat)
   - Data loss (deletion, export)
   - Interception (PHI transmitted unencrypted)
   - Malware/ransomware
   - Physical theft
3. Assess vulnerabilities:
   - Weak passwords or missing MFA?
   - Unencrypted storage?
   - Overly permissive access controls?
   - Outdated software with known exploits?
   - Inadequate logging?
4. Evaluate likelihood and impact:
   - For each threat, assess likelihood (rare, uncommon, common) and impact (low, moderate, high)
   - High-likelihood, high-impact risks get priority
5. Document findings:
   - Risk report
   - Risk register
   - Mitigation prioritization
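The likelihood-and-impact step above reduces to a simple ordinal scoring scheme. The example risks and the multiplicative score are illustrative; many risk registers use a comparable likelihood × impact matrix.

```python
# Ordinal scales matching the step above (rare/uncommon/common, low/moderate/high).
LIKELIHOOD = {"rare": 1, "uncommon": 2, "common": 3}
IMPACT = {"low": 1, "moderate": 2, "high": 3}

def score(likelihood: str, impact: str) -> int:
    """Likelihood x impact: 1 (lowest priority) to 9 (highest)."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# Hypothetical findings from the risk analysis.
risks = [
    ("Unencrypted PHI exports", "common", "high"),
    ("Physical theft of servers", "rare", "high"),
    ("Shared service-account credentials", "common", "moderate"),
]

# Sort the register so high-likelihood, high-impact risks come first.
register = sorted(risks, key=lambda r: score(r[1], r[2]), reverse=True)
for name, lik, imp in register:
    print(f"{score(lik, imp)}  {name}")
```

Running this prints the unencrypted-exports risk (score 9) ahead of shared credentials (6) and physical theft (3), which is exactly the mitigation ordering the step asks for.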
Compliance AI support:
- Risk analysis questionnaire
- Auto-generates risk register
- Prioritizes high-risk vulnerabilities
- Generates recommendations
Step 2: Implement Security Safeguards (2-4 months)
Address identified risks by implementing HIPAA Security Rule safeguards:
Administrative
- Designate security officer → assign responsibility
- Conduct security training → all staff
- Create policies:
- Access control (who can access PHI)
- Workstation use (acceptable uses of computers accessing PHI)
- Incident response (procedures for breach)
- Sanction policy (discipline for HIPAA violations)
- Document security procedures
Compliance AI support:
- Policy templates
- Training program builder
- Incident response playbooks
Physical
- Restrict data center access → locks, badge readers
- Monitor server rooms → camera, logs
- Workstation security → screen locks, cable locks
- Device & media controls → encrypted laptops, secure destruction of retired drives
Compliance AI support:
- Physical safeguard checklist
- Evidence collection (photos, access logs)
Technical
1. Access control:
   - Unique user IDs (not shared credentials)
   - Multi-factor authentication for PHI access
   - Role-based access (doctors see patient records, billing sees claims)
   - Audit logs (who accessed what, when)
   - Emergency access procedures (if usual access fails)
2. Encryption:
   - Encrypt PHI at rest (database, files on disk)
   - Encrypt PHI in transit (HTTPS/TLS for APIs)
   - Key management (secure generation, storage, rotation, destruction)
3. Integrity:
   - Detect unauthorized modification of PHI
   - Checksums or digital signatures on critical records
4. Audit controls:
   - Log all access to PHI systems
   - Retain logs (6+ years is typical)
   - Regularly review logs for suspicious activity
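The integrity control above can be sketched with a keyed HMAC over each critical record: any modification changes the signature. The key literal here is a placeholder; in practice the key lives in a managed key service with rotation, per the key-management bullet.

```python
import hmac
import hashlib

# Placeholder key -- in production this comes from a KMS and is rotated.
INTEGRITY_KEY = b"rotate-me-via-your-kms"

def sign(record: bytes) -> str:
    """HMAC-SHA256 signature over a serialized record."""
    return hmac.new(INTEGRITY_KEY, record, hashlib.sha256).hexdigest()

def verify(record: bytes, signature: str) -> bool:
    """Constant-time comparison detects unauthorized modification."""
    return hmac.compare_digest(sign(record), signature)

record = b'{"patient_id": "P-1001", "result": "A1c 6.9%"}'
sig = sign(record)
print(verify(record, sig))                 # True
print(verify(record + b"tampered", sig))   # False
```

An HMAC (rather than a plain checksum) is the better fit here because an attacker who can modify a record could also recompute an unkeyed hash.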
Compliance AI support:
- Encryption audit (checks if data at rest/transit encrypted)
- Access control assessment (role review, MFA audit)
- Audit log configuration
- Integration with infrastructure connectors (AWS CloudTrail, Azure Monitor, etc.)
Step 3: Vendor Assessment (if using cloud AI platforms)
If you use cloud providers (AWS SageMaker, Azure ML, Google Cloud AI) or third-party AI platforms:
1. Assess their HIPAA compliance:
   - Do they offer a BAA?
   - What HIPAA safeguards do they implement?
   - Where is PHI stored? (data residency)
   - Can you audit their controls?
2. Execute a Business Associate Agreement (BAA):
   - Define the PHI categories the vendor can access
   - Specify allowable uses (training, inference)
   - Include a data return/destruction clause
3. Document in the vendor risk register
Compliance AI support:
- Vendor risk assessment questionnaire
- BAA template
- Vendor scorecard
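The vendor scorecard mentioned above can be sketched as a weighted yes/no questionnaire. The question names and weights here are hypothetical, not a Compliance AI schema.

```python
# Illustrative criteria and weights for the vendor assessment questionnaire.
QUESTIONS = {
    "baa_available": 3,        # non-negotiable for PHI workloads
    "encryption_at_rest": 2,
    "encryption_in_transit": 2,
    "audit_rights": 1,
    "subcontractor_baas": 2,   # cloud subprocessors need BAAs too
}

def scorecard(answers: dict) -> float:
    """Fraction of weighted criteria the vendor satisfies (0.0 to 1.0)."""
    total = sum(QUESTIONS.values())
    earned = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    return round(earned / total, 2)

print(scorecard({
    "baa_available": True,
    "encryption_at_rest": True,
    "encryption_in_transit": True,
    "audit_rights": False,
    "subcontractor_baas": True,
}))  # 0.9
```

In practice some criteria (a missing BAA, for instance) should be hard gates rather than weighted points, since no score can offset them.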
Step 4: Documentation (1-2 weeks)
Compile HIPAA documentation:
1. Privacy documentation:
   - Notice of Privacy Practices (NPP) — patient-facing privacy policy
   - Uses and disclosures inventory
   - Patient authorization forms (if applicable)
2. Security documentation:
   - Risk analysis report
   - Security standards assessment
   - Policies and procedures manual
   - Training records
   - Audit log samples
3. Breach documentation:
   - Breach response procedures
   - Breach notification templates
   - HHS notification process
Store in compliance records for audit access.
Compliance AI support:
- Generates documentation packages
- Evidence mapping (links safeguards to requirements)
- Audit readiness report
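The evidence-mapping idea above — linking each implemented safeguard to the requirement it satisfies, then flagging gaps — can be sketched as a coverage check. The 45 CFR 164.312 citations are real Security Rule technical-safeguard standards; the safeguard names are hypothetical.

```python
# Security Rule technical-safeguard standards (45 CFR 164.312).
REQUIREMENTS = {
    "164.312(a)(1)": "Access control",
    "164.312(b)": "Audit controls",
    "164.312(c)(1)": "Integrity",
    "164.312(e)(1)": "Transmission security",
}

# Hypothetical evidence items mapped to the citations they cover.
evidence = {
    "RBAC + MFA rollout": ["164.312(a)(1)"],
    "CloudTrail log pipeline": ["164.312(b)"],
    "TLS-everywhere policy": ["164.312(e)(1)"],
}

covered = {c for cites in evidence.values() for c in cites}
gaps = sorted(set(REQUIREMENTS) - covered)
print(gaps)  # ['164.312(c)(1)'] -- integrity control still lacks evidence
```

An audit-readiness report is essentially this check run across every requirement in scope, with the gap list driving remediation before the auditor asks.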
Step 5: Ongoing Monitoring & Compliance (continuous)
1. Monthly:
   - Review access logs for anomalies
   - Check encryption status
   - Verify backups completed successfully
2. Quarterly:
   - Test disaster recovery/business continuity
   - Review incident response procedures
   - Update the risk register if systems or threats change
3. Annually:
   - Recertify that security safeguards are in place
   - Conduct a security training refresher
   - Perform a full risk analysis
   - Audit vendor compliance
   - Report compliance status to governance/board
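The monthly access-log review can be partially automated. Here is a minimal sketch that flags PHI access outside a working-hours window; the 07:00–19:00 window, event format, and user names are assumptions to tune per role and workflow.

```python
from datetime import datetime

def off_hours(events: list[tuple[str, str]], start: int = 7, end: int = 19) -> list[str]:
    """Return user IDs with access events outside [start, end) local hours.

    events: (user_id, ISO-8601 timestamp) pairs from the audit log.
    """
    flagged = []
    for user, ts in events:
        hour = datetime.fromisoformat(ts).hour
        if not (start <= hour < end):
            flagged.append(user)
    return sorted(set(flagged))

# Hypothetical audit-log extract for one review window.
events = [
    ("dr_lee", "2025-06-02T14:12:00"),      # in hours
    ("billing_svc", "2025-06-03T02:47:00"), # 02:47 -- flag
    ("dr_lee", "2025-06-03T22:05:00"),      # 22:05 -- flag
]
print(off_hours(events))  # ['billing_svc', 'dr_lee']
```

Flags like these are review triggers, not verdicts; on-call clinicians legitimately access PHI at night, which is why the window should be role-specific.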
Compliance AI support:
- Automated monitoring dashboards
- Monthly compliance reports
- Risk re-assessment workflow
- Training tracker
FDA AI Guidance for Medical Devices
If your AI functions as a medical device (it diagnoses, treats, or prevents disease), FDA oversight applies.
What Triggers FDA Oversight?
AI is a medical device if it:
- Helps diagnose disease (diagnostic imaging AI, lab test interpretation)
- Helps treat disease (dosing recommendation, surgery guidance)
- Prevents disease (risk prediction)
Example: Skin cancer detection AI = medical device
FDA Regulatory Pathways
| Pathway | Risk Level | Example |
|---|---|---|
| De Novo | Novel, low-moderate risk | Novel AI diagnostic algorithm |
| 510(k) | Moderate risk, predicate device exists | AI analyzing existing imaging type |
| PMA (Premarket Approval) | High risk | AI making autonomous medical decisions |
FDA Expectations for AI
FDA guidance, shaped by the 21st Century Cures Act's software provisions, expects:
1. Good Machine Learning Practice (GMLP):
   - Development practices (version control, testing)
   - Validation (training/test data quality, performance metrics)
   - Monitoring (performance drift detection)
2. Data quality:
   - Representative training data
   - Bias assessment
   - Data provenance documentation
3. Model performance:
   - Accuracy, sensitivity, specificity on a real-world test set
   - Performance across subpopulations (demographic parity)
4. Real-world performance monitoring:
   - Post-market surveillance
   - Alert if performance degrades
   - Retraining or an action plan if needed
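The drift-detection expectation above can be sketched as a threshold check: alert when a monitoring window's sensitivity falls more than a tolerance below the validated baseline. The baseline, tolerance, and counts are illustrative numbers, not FDA-specified values.

```python
# Illustrative validated baseline and alert tolerance.
BASELINE_SENSITIVITY = 0.92
TOLERANCE = 0.05

def check_drift(tp: int, fn: int) -> tuple[float, bool]:
    """Return (observed sensitivity, alert?) for one monitoring window.

    tp: true positives; fn: false negatives among confirmed cases.
    """
    sensitivity = tp / (tp + fn)
    alert = sensitivity < BASELINE_SENSITIVITY - TOLERANCE
    return round(sensitivity, 3), alert

print(check_drift(tp=460, fn=40))  # (0.92, False) -- at baseline
print(check_drift(tp=420, fn=80))  # (0.84, True)  -- degraded, alert
```

A production monitor would run this per subpopulation as well as overall, since aggregate sensitivity can stay flat while performance degrades for a specific demographic group.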
Compliance AI support:
- FDA readiness assessment
- Training data quality documentation
- Bias testing results
- Performance monitoring setup
- Real-world data integration
Comparing HIPAA to Other Frameworks
| Framework | Focus | Applies To |
|---|---|---|
| HIPAA | Patient data privacy/security | Healthcare organizations & vendors |
| GDPR | EU personal data protection | Any org processing EU resident data |
| NIST Cybersecurity Framework | General cybersecurity | Any organization (HIPAA builds on it) |
| FDA AI Guidance | Medical devices | AI used to diagnose/treat disease |
If healthcare AI + EU users: Implement both HIPAA and GDPR
Common HIPAA Violations for AI
- Insufficient access control — Too many staff can view PHI
- Unencrypted data — PHI stored in plaintext or transmitted over HTTP
- Inadequate audit logs — Cannot trace who accessed PHI
- No breach response plan — Delayed breach notification
- Vendor non-compliance — Using subcontractors without BAAs
- Inadequate risk analysis — Risks not formally assessed
- Insufficient training — Staff unaware of HIPAA requirements
Compliance AI flags all of these automatically.
Next Steps
- Run HIPAA risk analysis: Go to Compliance > Frameworks > HIPAA > Assessment
- Set up access audit: Evidence Connectors
- Conduct data quality assessment: AI System Registry
- Document safeguards: Policies & Control Management