
HIPAA Compliance for AI in Healthcare

HIPAA (Health Insurance Portability and Accountability Act) requires healthcare organizations and their vendors to protect patient health information (PHI). If your AI system processes, stores, or transmits PHI — including electronic health records (EHR), lab results, insurance claims, or genomic data — HIPAA applies. Compliance AI automates HIPAA compliance for healthcare AI: risk analysis, safeguard implementation, Business Associate Agreement (BAA) management, and breach notification.

What Is HIPAA?

HIPAA is the US federal law protecting patient privacy and security. It applies to:

  • Covered entities: Hospitals, clinics, pharmacies, health plans, healthcare clearinghouses
  • Business associates: Vendors processing PHI on behalf of covered entities (cloud providers, analytics platforms, AI vendors)

Penalties: $100 to $50,000 per violation, tiered by level of culpability, with an annual cap of $1.5 million per violation category (amounts adjusted for inflation)

HIPAA Rules for AI Systems

Privacy Rule (45 CFR Part 160 and Part 164, Subparts A and E)

Restricts how PHI can be used and disclosed.

Key requirements:

  • Use/disclose PHI only for treatment, payment, operations, or with authorization
  • Provide privacy notices to patients
  • Honor patient requests to access, amend, or restrict use
  • Document PHI inventory and uses
  • Limit PHI access to minimum necessary

For AI systems:

  • Document why AI needs access to PHI
  • Limit training data to what’s necessary
  • Don’t sell PHI without written authorization
  • De-identify data before using for research/analytics
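De-identification under the Safe Harbor method (45 CFR 164.514(b)(2)) means removing 18 categories of identifiers. A simplified sketch, assuming records are plain dicts with hypothetical field names (a real pipeline must cover all 18 categories, including the low-population ZIP3 caveat):

```python
# Hypothetical field names; Safe Harbor lists 18 identifier categories,
# and this sketch covers only a few of them.
DIRECT_IDENTIFIERS = {"name", "mrn", "ssn", "email", "phone", "address"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "birth_date" in clean:
        # Safe Harbor: keep only the year of dates tied to an individual.
        clean["birth_year"] = clean.pop("birth_date")[:4]
    if "zip" in clean:
        # Safe Harbor: truncate ZIP codes to the first three digits.
        clean["zip"] = clean["zip"][:3]
    return clean

row = {"name": "Jane Doe", "mrn": "12345", "birth_date": "1984-07-02",
       "zip": "90210", "lab_result": "A1C 6.1%"}
print(deidentify(row))
```

Running this on a training record yields only the coarsened, non-identifying fields, which can then be used for research or analytics.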

Security Rule (45 CFR Part 160 and Part 164, Subparts A and C)

Requires administrative, physical, and technical safeguards for PHI.

Administrative safeguards:

  • Information security officer designated
  • Security training for staff
  • Incident response procedures
  • Authorization and supervision policies

Physical safeguards:

  • Restricted access to facilities/servers storing PHI
  • Workstation security policies
  • Device and media controls (tracking, reuse, and disposal of hardware storing PHI)

Technical safeguards:

  • Access control (unique IDs, emergency access procedures)
  • Encryption (data in transit and at rest)
  • Audit controls (logs of who accessed what)
  • Integrity controls (detect unauthorized modification)

For AI systems:

  • Encrypt model training data
  • Audit model inference API access
  • Implement role-based access control
  • Monitor unusual access patterns
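Role-based access control plus audit logging can be sketched in a few lines. This is a minimal illustration with hypothetical roles and permissions; a real deployment would back it with an identity provider rather than an in-memory dict:

```python
from datetime import datetime, timezone

# Hypothetical roles and permissions, for illustration only.
ROLE_PERMISSIONS = {
    "physician":   {"read_record", "write_note"},
    "billing":     {"read_claims"},
    "ml_engineer": {"read_deidentified"},
}

def authorize(role: str, action: str, audit_log: list) -> bool:
    """Grant the action only if the role allows it, and log every attempt."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "role": role, "action": action, "allowed": allowed,
    })
    return allowed

log = []
authorize("physician", "read_record", log)   # allowed
authorize("billing", "read_record", log)     # denied: minimum necessary
print([e["allowed"] for e in log])
```

Note that denied attempts are logged too; the audit trail of who tried to access what is itself a Security Rule requirement.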

Breach Notification Rule (45 CFR Part 164 Subpart D)

If PHI is breached (unauthorized acquisition, access, use, or disclosure):

  1. Assess — Conduct a risk assessment; an impermissible use or disclosure is presumed to be a breach unless the assessment shows a low probability that the PHI was compromised
  2. Notify individuals — Without unreasonable delay, and no later than 60 days after discovery
  3. Notify media — If the breach affects 500+ residents of a state or jurisdiction
  4. Notify HHS (Office for Civil Rights) — Within 60 days of discovery for breaches affecting 500+ individuals; smaller breaches may be logged and reported annually

Since the 2013 Omnibus Rule, HIPAA defines “breach” broadly: any impermissible acquisition, access, use, or disclosure of PHI is presumed to be a breach unless a documented risk assessment demonstrates a low probability of compromise.

For AI systems:

  • Monitor for unauthorized access
  • If model outputs expose PHI → treat as a potential breach
  • If training data containing PHI leaks → treat as a potential breach
  • Implement safeguards to prevent/detect breaches
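The notification timelines above can be computed mechanically once a breach is discovered. A sketch, assuming the standard Breach Notification Rule deadlines (this is a planning aid, not legal advice):

```python
from datetime import date, timedelta

def notification_deadlines(discovered: date, affected: int) -> dict:
    """Latest permissible notification dates under the Breach Notification Rule."""
    # Individuals: without unreasonable delay, no later than 60 days.
    deadlines = {"individuals": discovered + timedelta(days=60)}
    if affected >= 500:
        # 500+ individuals: HHS and prominent media within the same window.
        deadlines["hhs"] = deadlines["media"] = discovered + timedelta(days=60)
    else:
        # Under 500: HHS may be notified annually, within 60 days of the
        # end of the calendar year in which the breach was discovered.
        deadlines["hhs"] = date(discovered.year, 12, 31) + timedelta(days=60)
    return deadlines

print(notification_deadlines(date(2024, 3, 1), affected=1200))
```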

Business Associate Agreements (BAA)

If you’re a vendor providing AI services to healthcare organizations, you need a Business Associate Agreement (BAA).

What Is a BAA?

A legal contract between a covered entity and vendor that:

  1. Defines what PHI the vendor accesses
  2. Restricts how vendor can use/disclose PHI
  3. Requires vendor to implement HIPAA Security Rule safeguards
  4. Requires vendor to permit audits
  5. Requires vendor to notify covered entity of breaches
  6. Requires vendor to return/destroy PHI when relationship ends

BAA Requirements

Your BAA must include:

  • Permitted uses: Only use PHI as specified by the covered entity (treatment, payment, operations)
  • Subcontractors: If you use subcontractors (e.g., cloud providers), they must also have BAAs
  • Safeguards: Implement HIPAA Security Rule safeguards equivalent to the covered entity's
  • Breach notification: Notify the covered entity of breaches within 60 days
  • Audit & inspection: Allow the covered entity and HHS to audit records
  • Data return/destruction: Return or destroy PHI when the relationship ends
  • Authorization for disclosure: Do not disclose PHI without covered entity authorization
  • Minimum necessary: Disclose only the minimum PHI needed for the specified purpose
  • Individual rights: Assist the covered entity in honoring patient rights (access, amendment, restriction)
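The "minimum necessary" requirement can be enforced in code by whitelisting the fields each disclosure purpose is entitled to. A minimal sketch with hypothetical purposes and field names:

```python
# Hypothetical per-purpose field whitelists implementing the
# "minimum necessary" standard.
ALLOWED_FIELDS = {
    "billing":   {"patient_id", "claim_code", "amount"},
    "inference": {"patient_id", "lab_values", "vitals"},
}

def minimum_necessary(record: dict, purpose: str) -> dict:
    """Return only the fields the stated disclosure purpose may receive."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"patient_id": "P-1", "claim_code": "E11.9",
          "amount": 120.0, "lab_values": {"a1c": 6.1}}
print(minimum_necessary(record, "billing"))
```

An unknown purpose gets an empty whitelist, so the default is to disclose nothing.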

Compliance AI support:

  • BAA template (customizable)
  • HIPAA compliance checklist for vendors
  • Security safeguards evidence collection

HIPAA Compliance Roadmap for Healthcare AI

Step 1: Risk Analysis (2-4 weeks)

Assess security risks to PHI in your AI system:

  1. Identify assets:

    • What PHI does the system access? (patient names, IDs, medical records, lab results, genomics)
    • Where is PHI stored? (database, files, logs)
    • How is PHI transmitted? (API, batch exports, sync services)
  2. Identify threats:

    • Unauthorized access (hacking, insider threat)
    • Data loss (deletion, export)
    • Interception (PHI transmitted unencrypted)
    • Malware/ransomware
    • Physical theft
  3. Assess vulnerabilities:

    • Weak passwords or missing MFA?
    • Unencrypted storage?
    • Overly permissive access controls?
    • Outdated software with known exploits?
    • Inadequate logging?
  4. Evaluate likelihood and impact:

    • For each threat, assess: likelihood (rare, uncommon, common) and impact (low, moderate, high)
    • High-likelihood + high-impact risks get priority
  5. Document findings:

    • Risk report
    • Risk register
    • Mitigation prioritization
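The likelihood/impact evaluation in step 4 can be turned into a sortable risk register. A toy sketch of that scoring; the three-level scales come from the steps above, while the numeric weights are illustrative (formal HIPAA risk analyses often follow the NIST SP 800-30 methodology):

```python
# Illustrative numeric weights for the three-level scales above.
LIKELIHOOD = {"rare": 1, "uncommon": 2, "common": 3}
IMPACT = {"low": 1, "moderate": 2, "high": 3}

def build_risk_register(threats: list) -> list:
    """Score each threat and sort so high-likelihood/high-impact risks come first."""
    for t in threats:
        t["score"] = LIKELIHOOD[t["likelihood"]] * IMPACT[t["impact"]]
    return sorted(threats, key=lambda t: t["score"], reverse=True)

register = build_risk_register([
    {"threat": "unencrypted backups", "likelihood": "common", "impact": "high"},
    {"threat": "physical theft", "likelihood": "rare", "impact": "moderate"},
    {"threat": "weak passwords", "likelihood": "common", "impact": "moderate"},
])
print([r["threat"] for r in register])
```

The sorted register is exactly the mitigation prioritization called for in step 5.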

Compliance AI support:

  • Risk analysis questionnaire
  • Auto-generates risk register
  • Prioritizes high-risk vulnerabilities
  • Generates recommendations

Step 2: Implement Security Safeguards (2-4 months)

Address identified risks by implementing HIPAA Security Rule safeguards:

Administrative

  • Designate security officer → assign responsibility
  • Conduct security training → all staff
  • Create policies:
    • Access control (who can access PHI)
    • Workstation use (acceptable uses of computers accessing PHI)
    • Incident response (procedures for breach)
    • Sanction policy (discipline for HIPAA violations)
  • Document security procedures

Compliance AI support:

  • Policy templates
  • Training program builder
  • Incident response playbooks

Physical

  • Restrict data center access → locks, badge readers
  • Monitor server rooms → camera, logs
  • Workstation security → screen locks, cable locks
  • Device & media controls → encrypted laptops, secure destruction of retired drives

Compliance AI support:

  • Physical safeguard checklist
  • Evidence collection (photos, access logs)

Technical

  • Access control:

    • Unique user IDs (not shared credentials)
    • Multi-factor authentication for PHI access
    • Role-based access (doctors see patient records, billing sees claims)
    • Audit logs (who accessed what, when)
    • Emergency access procedures (if usual access fails)
  • Encryption:

    • Encrypt PHI at rest (database, files on disk)
    • Encrypt PHI in transit (HTTPS/TLS for APIs)
    • Key management (secure generation, storage, rotation, destruction)
  • Integrity:

    • Detect unauthorized modification of PHI
    • Checksums or digital signatures on critical records
  • Audit controls:

    • Log all access to PHI systems
    • Retention of logs (6+ years typical)
    • Regular review of logs for suspicious activity
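For the integrity control, a keyed checksum over each record detects unauthorized modification. A minimal sketch using HMAC-SHA256; the key is shown inline only for illustration and would live in a key-management service in practice:

```python
import hashlib
import hmac
import json

# Illustrative key; production keys belong in a KMS, never in source.
KEY = b"demo-key-do-not-use-in-production"

def sign_record(record: dict) -> str:
    """HMAC-SHA256 over a canonical JSON serialization of the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, signature: str) -> bool:
    """Constant-time comparison against the stored signature."""
    return hmac.compare_digest(sign_record(record), signature)

rec = {"patient_id": "P-1", "a1c": 6.1}
sig = sign_record(rec)
assert verify_record(rec, sig)
rec["a1c"] = 5.0                      # unauthorized modification
assert not verify_record(rec, sig)    # tampering is detected
```

Signatures are stored alongside the records and re-verified on read, so any out-of-band edit to a critical record fails verification.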

Compliance AI support:

  • Encryption audit (checks if data at rest/transit encrypted)
  • Access control assessment (role review, MFA audit)
  • Audit log configuration
  • Integration with infrastructure connectors (AWS CloudTrail, Azure Monitor, etc.)

Step 3: Vendor Assessment (if using cloud AI platforms)

If you use cloud providers (AWS SageMaker, Azure ML, Google Cloud AI) or third-party AI platforms:

  1. Assess their HIPAA compliance:

    • Do they offer a BAA?
    • What HIPAA safeguards do they implement?
    • Where is PHI stored? (Data residency)
    • Can you audit their controls?
  2. Execute Business Associate Agreement (BAA)

    • Define PHI categories vendor can access
    • Specify allowable uses (training, inference)
    • Include data return/destruction clause
  3. Document in vendor risk register
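A vendor scorecard can be as simple as a weighted yes/no questionnaire over the checks above. The questions and weights here are illustrative, not drawn from any regulation:

```python
# Hypothetical questionnaire; weights are illustrative only.
QUESTIONS = {
    "signs_baa": 3,
    "encrypts_at_rest": 2,
    "encrypts_in_transit": 2,
    "supports_audits": 1,
    "data_residency_us": 1,
}

def vendor_score(answers: dict) -> float:
    """Fraction of weighted questionnaire points the vendor earns."""
    earned = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    return earned / sum(QUESTIONS.values())

score = vendor_score({"signs_baa": True, "encrypts_at_rest": True,
                      "encrypts_in_transit": True, "supports_audits": False,
                      "data_residency_us": True})
print(f"{score:.0%}")
```

The resulting percentage goes straight into the vendor risk register; a missing BAA carries the heaviest weight because without one the vendor cannot lawfully handle PHI at all.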

Compliance AI support:

  • Vendor risk assessment questionnaire
  • BAA template
  • Vendor scorecard

Step 4: Documentation (1-2 weeks)

Compile HIPAA documentation:

  1. Privacy documentation:

    • Notice of Privacy Practices (NPP) — patient-facing privacy policy
    • Uses and disclosures inventory
    • Patient authorization forms (if applicable)
  2. Security documentation:

    • Risk analysis report
    • Security standards assessment
    • Policies and procedures manual
    • Training records
    • Audit log samples
  3. Breach documentation:

    • Breach response procedures
    • Breach notification templates
    • HHS notification process

Store in compliance records for audit access.

Compliance AI support:

  • Generates documentation packages
  • Evidence mapping (links safeguards to requirements)
  • Audit readiness report

Step 5: Ongoing Monitoring & Compliance (monthly)

  1. Monthly:

    • Review access logs for anomalies
    • Check encryption status
    • Verify backups completed successfully
  2. Quarterly:

    • Test disaster recovery/business continuity
    • Review incident response procedures
    • Update risk register if systems/threats change
  3. Annually:

    • Recertify that security safeguards are in place
    • Conduct security training refresher
    • Perform full risk analysis
    • Audit vendor compliance
    • Report compliance status to governance/board
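The monthly access-log review can be partly automated with a simple anomaly rule. A toy sketch that flags any user whose daily count of PHI record accesses exceeds a multiple of the median across users; the 10x threshold is illustrative, not from any regulation:

```python
from statistics import median

def flag_anomalies(daily_counts: dict, multiple: float = 10.0) -> list:
    """Flag users whose access count exceeds `multiple` times the median."""
    m = median(daily_counts.values())
    return [user for user, n in daily_counts.items()
            if m > 0 and n > multiple * m]

counts = {"dr_smith": 42, "dr_jones": 38, "billing_1": 35,
          "nurse_kim": 40, "svc_account": 4100}
print(flag_anomalies(counts))
```

The median is used rather than the mean because a single runaway account (like the service account here) would drag the mean up and hide itself.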

Compliance AI support:

  • Automated monitoring dashboards
  • Monthly compliance reports
  • Risk re-assessment workflow
  • Training tracker

FDA AI Guidance for Medical Devices

If your AI is a medical device (diagnoses or treats disease), FDA oversight applies.

What Triggers FDA Oversight?

AI is a medical device if it:

  • Helps diagnose disease (diagnostic imaging AI, lab test interpretation)
  • Helps treat disease (dosing recommendation, surgery guidance)
  • Prevents disease (risk prediction)

Example: Skin cancer detection AI = medical device

FDA Regulatory Pathways

  • De Novo: Novel device type, low-to-moderate risk (e.g., a novel AI diagnostic algorithm)
  • 510(k): Moderate risk where a predicate device exists (e.g., AI analyzing an established imaging type)
  • PMA (Premarket Approval): High risk (e.g., AI making autonomous medical decisions)

FDA Expectations for AI

FDA guidance for AI/ML-enabled medical devices expects:

  1. Good Machine Learning Practice (GMLP):

    • Development practices (version control, testing)
    • Validation (training/test data quality, performance metrics)
    • Monitoring (performance drift detection)
  2. Data quality:

    • Representative training data
    • Bias assessment
    • Data provenance documentation
  3. Model performance:

    • Accuracy, sensitivity, specificity on real-world test set
    • Performance across subpopulations (demographic parity)
  4. Real-world performance monitoring:

    • Post-market surveillance
    • Alert if performance degrades
    • Retraining or action plan if needed
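Subgroup performance monitoring (items 3 and 4 above) can be sketched directly from confusion-matrix counts. Field names and the 0.80 floor below are illustrative assumptions, not FDA-mandated values:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def check_subgroups(confusions: dict, floor: float = 0.80) -> list:
    """Return (group, sensitivity, specificity) for any group below the floor."""
    alerts = []
    for group, (tp, fn, tn, fp) in confusions.items():
        sens, spec = sensitivity_specificity(tp, fn, tn, fp)
        if sens < floor or spec < floor:
            alerts.append((group, round(sens, 2), round(spec, 2)))
    return alerts

confusions = {  # (TP, FN, TN, FP) per demographic group
    "group_a": (90, 10, 85, 15),
    "group_b": (60, 40, 88, 12),   # sensitivity gap in this subgroup
}
print(check_subgroups(confusions))
```

Run against real-world post-market data on a schedule, an alert like this is the trigger for the retraining or action plan mentioned above.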

Compliance AI support:

  • FDA readiness assessment
  • Training data quality documentation
  • Bias testing results
  • Performance monitoring setup
  • Real-world data integration

Comparing HIPAA to Other Frameworks

  • HIPAA: Patient data privacy/security; applies to healthcare organizations & vendors
  • GDPR: EU personal data protection; applies to any org processing EU residents' data
  • NIST Cybersecurity Framework: General cybersecurity; applies to any organization (HIPAA safeguards map onto it)
  • FDA AI Guidance: Medical devices; applies to AI used to diagnose or treat disease

If healthcare AI + EU users: Implement both HIPAA and GDPR

Common HIPAA Violations for AI

  1. Insufficient access control — Too many staff can view PHI
  2. Unencrypted data — PHI stored in plaintext or transmitted over HTTP
  3. Inadequate audit logs — Cannot trace who accessed PHI
  4. No breach response plan — Delayed breach notification
  5. Vendor non-compliance — Using subcontractors without BAAs
  6. Inadequate risk analysis — Risks not formally assessed
  7. Insufficient training — Staff unaware of HIPAA requirements

Compliance AI flags all of these automatically.

Next Steps