# Real-Time Alerts
Get real-time notifications when AI outputs are flagged for hallucinations, accuracy issues, or policy violations. TruthVouch provides multiple channels including email, Slack, webhooks, and SignalR for instant alerting.
## Alert Architecture
```
AI Output
    ↓
[Verification Engine]
    ↓
[Issue Detected?]
    ├─ YES → [Route to Alert Channels]
    │         ├─ Email
    │         ├─ Slack
    │         ├─ WebSocket (SignalR)
    │         ├─ Webhook
    │         └─ Integration (PagerDuty, Teams, etc.)
    │
    └─ NO → [Archive]
```

## Alert Types
| Type | Trigger | Severity | Use Case |
|---|---|---|---|
| Hallucination Detected | Confidence < threshold | Critical | LLM output unreliable |
| Citation Missing | Claim not cited | High | Governance violation |
| Policy Violation | Custom rule triggered | Variable | Custom business rule |
| Contamination Detected | Cross-domain bleed | High | Brand integrity risk |
| PII Exposure | Sensitive data in output | Critical | Compliance violation |
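The table above can be read as a routing map from alert type to severity and default channels. As a sketch (the `ALERT_ROUTES` structure and its channel defaults are illustrative, not SDK constants):

```python
# Illustrative routing map: alert type → (severity, default channels).
# Policy violations carry no fixed severity; it is set per rule.
ALERT_ROUTES = {
    "hallucination_detected": ("critical", ["email", "slack", "pagerduty"]),
    "citation_missing":       ("high",     ["slack"]),
    "policy_violation":       (None,       ["webhook"]),
    "contamination_detected": ("high",     ["slack", "email"]),
    "pii_exposure":           ("critical", ["email", "pagerduty"]),
}

def channels_for(alert_type):
    """Look up the default channels for a given alert type."""
    severity, channels = ALERT_ROUTES[alert_type]
    return channels
```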
## Setting Up Alerts
### Basic Email Alerts
```python
from truthvouch.client import TruthVouchClient

client = TruthVouchClient(api_key="your-api-key")

# Create alert rule
rule = client.alerts.create_rule(
    name="Hallucination Alert",
    trigger="confidence < 0.7",
    channels=["email"],
    email_recipients=["ops@company.com"],
    enabled=True
)

print(f"Alert rule created: {rule.id}")
```

### Multi-Channel Alerts
```python
# Alert via multiple channels
rule = client.alerts.create_rule(
    name="Critical Issues Only",
    trigger="severity = critical",
    channels=[
        {
            "type": "email",
            "recipients": ["critical@company.com"]
        },
        {
            "type": "slack",
            "webhook_url": "https://hooks.slack.com/services/..."
        },
        {
            "type": "pagerduty",
            "integration_key": "xxx"
        }
    ],
    enabled=True
)
```

## Channel Configuration
### Email Alerts
```python
# Basic email
client.alerts.configure_email(
    smtp_host="smtp.gmail.com",
    smtp_port=587,
    smtp_user="alerts@company.com",
    smtp_password="your-app-password",
    from_email="alerts@company.com"
)

# Template customization
client.alerts.set_email_template(
    template_id="hallucination_alert",
    subject="Alert: Potential hallucination detected",
    body="""
    An AI output was flagged for potential hallucination.

    Query: {{query}}
    Response: {{response}}
    Confidence: {{confidence}}%

    Review in dashboard: {{dashboard_url}}
    """
)
```

### Slack Alerts
```python
# Create Slack webhook
slack_webhook = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"

rule = client.alerts.create_rule(
    name="Slack Hallucination Alerts",
    trigger="confidence < 0.7",
    channels=[
        {
            "type": "slack",
            "webhook_url": slack_webhook,
            "mention": "@ops",
            "thread": False
        }
    ]
)
```

### Webhook Alerts
```python
# Post alerts to your system
rule = client.alerts.create_rule(
    name="Custom Webhook",
    trigger="severity = critical OR severity = high",
    channels=[
        {
            "type": "webhook",
            "url": "https://api.company.com/alerts",
            "headers": {
                "Authorization": "Bearer token123",
                "Content-Type": "application/json"
            },
            "retry_count": 3,
            "retry_delay_seconds": 5
        }
    ]
)
```

### SignalR (Real-Time WebSocket)
```typescript
import * as signalr from "@microsoft/signalr";

// Connect to SignalR hub
const connection = new signalr.HubConnectionBuilder()
    .withUrl("https://api.truthvouch.com/alerts")
    .withAutomaticReconnect()
    .build();

// Listen for alerts
connection.on("alert", (alert) => {
    console.log("Alert received:", alert);

    // Update UI
    updateAlertBadge(alert.severity);
    addAlertToList(alert);

    // Play sound for critical alerts
    if (alert.severity === "critical") {
        playAlertSound();
    }
});

// Start connection
connection.start().catch(err => console.error(err));
```

## Custom Alert Rules
### Condition Syntax
```
# Confidence-based
confidence < 0.7

# Category-based
category IN ["healthcare", "legal"] AND confidence < 0.9

# Text-based
text CONTAINS "warning" OR text CONTAINS "error"

# Composite
(confidence < 0.7 AND category = "financial") OR
(pii_detected = true)
```

### Creating Complex Rules
```python
# Rule 1: Healthcare needs very high confidence
healthcare_rule = client.alerts.create_rule(
    name="Healthcare Accuracy Gate",
    trigger="""
        category = 'healthcare' AND confidence < 0.95
    """,
    severity="critical",
    channels=["email", "slack"],
    enabled=True
)

# Rule 2: Financial claims must have citations
finance_rule = client.alerts.create_rule(
    name="Finance Citation Enforcement",
    trigger="""
        category = 'financial' AND citations_count = 0
    """,
    severity="high",
    channels=["slack", "webhook"],
    enabled=True
)

# Rule 3: Batch alerts on repeated issues
batch_rule = client.alerts.create_rule(
    name="Repeated Hallucinations",
    trigger="""
        COUNT(confidence < 0.7) OVER (LAST_HOUR) > 5
    """,
    severity="medium",
    channels=["email"],
    enabled=True
)
```

## Alert Dashboard
Monitor all alerts in real-time:
```jsx
import React, { useEffect, useState } from 'react';
import { TruthVouchAlertProvider, useAlerts } from '@truthvouch/react';

function AlertDashboard() {
  const { alerts, subscribe } = useAlerts();
  const [filter, setFilter] = useState('all');

  useEffect(() => {
    // Subscribe to live alerts
    const unsubscribe = subscribe({
      onAlert: (alert) => {
        console.log('New alert:', alert);
      }
    });

    return unsubscribe;
  }, []);

  const filteredAlerts = filter === 'all'
    ? alerts
    : alerts.filter(a => a.severity === filter);

  return (
    <div className="alert-dashboard">
      <div className="alert-controls">
        <button onClick={() => setFilter('critical')}>
          Critical ({alerts.filter(a => a.severity === 'critical').length})
        </button>
        <button onClick={() => setFilter('high')}>
          High ({alerts.filter(a => a.severity === 'high').length})
        </button>
        <button onClick={() => setFilter('all')}>
          All ({alerts.length})
        </button>
      </div>

      <div className="alert-list">
        {filteredAlerts.map(alert => (
          <AlertCard key={alert.id} alert={alert} />
        ))}
      </div>
    </div>
  );
}

function AlertCard({ alert }) {
  return (
    <div className={`alert alert-${alert.severity}`}>
      <h4>{alert.title}</h4>
      <p>{alert.description}</p>
      <details>
        <summary>View Details</summary>
        <pre>{JSON.stringify(alert.data, null, 2)}</pre>
      </details>
      <div className="alert-actions">
        <button>Acknowledge</button>
        <button>Suppress Rule</button>
        <button>Review</button>
      </div>
    </div>
  );
}

export default AlertDashboard;
```

## Alert Handling Examples
### Python: Processing Alerts
```python
from truthvouch.webhooks import WebhookHandler
from flask import Flask, request

app = Flask(__name__)
handler = WebhookHandler(api_key="your-api-key")

@app.route("/alerts/webhook", methods=["POST"])
def handle_alert():
    """Process incoming alert."""

    # Verify webhook signature
    if not handler.verify_signature(request):
        return {"error": "Invalid signature"}, 401

    alert = request.get_json()

    # Route by severity
    if alert["severity"] == "critical":
        notify_on_call()
        escalate_to_senior_engineer()
    elif alert["severity"] == "high":
        notify_team()
        create_jira_ticket(alert)
    else:
        log_to_analytics(alert)

    # Acknowledge receipt
    return {"status": "received"}, 200

def notify_on_call():
    """Call on-call engineer."""
    # Integration with PagerDuty, Opsgenie, etc.
    pass

def escalate_to_senior_engineer():
    """Escalate critical issues."""
    pass

def notify_team():
    """Notify team via Slack."""
    pass

def create_jira_ticket(alert):
    """Auto-create Jira ticket."""
    # `jira` is assumed to be a pre-configured client object
    jira.create_issue(
        project="OPS",
        issue_type="Bug",
        summary=alert["title"],
        description=alert["description"],
        priority="High"
    )

def log_to_analytics(alert):
    """Log to analytics."""
    # `analytics` is assumed to be a pre-configured client object
    analytics.track("alert", alert)
```

### TypeScript: Alert Reaction Handler
```typescript
import { AlertClient } from "@truthvouch/sdk";

class AlertReactionHandler {
  constructor(private client: AlertClient) {}

  async handleAlert(alert: Alert) {
    // Analyze alert
    const analysis = await this.analyzeAlert(alert);

    // Determine action
    if (analysis.isPatterned) {
      await this.suppressRule(alert.ruleId);
      await this.notifyTeam(
        `Suppressed rule ${alert.ruleId}: ${analysis.reason}`
      );
    } else if (analysis.isFalsePositive) {
      await this.reportFalsePositive(alert);
    } else {
      await this.escalate(alert);
    }
  }

  private async analyzeAlert(alert: Alert): Promise<Analysis> {
    // Check if part of pattern
    const recentAlerts = await this.getRecentAlerts(
      alert.ruleId,
      30 * 60 * 1000 // Last 30 minutes
    );

    return {
      isPatterned: recentAlerts.length > 5,
      isFalsePositive: this.checkFalsePositive(alert),
      reason: "Too many similar alerts"
    };
  }

  private async suppressRule(ruleId: string) {
    // Temporarily disable rule to reduce noise
    await this.client.alerts.updateRule(ruleId, {
      enabled: false,
      suppressed_until: new Date(Date.now() + 24 * 60 * 60 * 1000)
    });
  }
}
```

## Alert Aggregation
Group related alerts to reduce noise:
```python
# Aggregate similar alerts in 5-minute windows
aggregation_rule = client.alerts.create_rule(
    name="Hallucination Alert (Aggregated)",
    trigger="confidence < 0.7",
    aggregation={
        "window_seconds": 300,  # 5 minutes
        "group_by": ["category", "user_id"],
        "condition": "COUNT > 3"  # Only alert if 3+ in window
    },
    channels=["slack"],
    enabled=True
)
```

## Alert Suppression
Suppress noisy rules temporarily:
```python
# Suppress rule for 24 hours
client.alerts.suppress_rule(
    rule_id="rule_abc123",
    duration_hours=24,
    reason="Too many false positives with new LLM version"
)

# Query suppressed rules
suppressed = client.alerts.list_suppressed_rules()
```

## Alert Metrics
Track alert performance:
```python
# Get alert stats
stats = client.alerts.get_stats(
    start_date="2024-03-01",
    end_date="2024-03-15"
)

print(f"Total alerts: {stats.total}")
print(f"Critical: {stats.by_severity['critical']}")
print(f"Acknowledged: {stats.acknowledged_count}")
print(f"Avg response time: {stats.avg_response_time_minutes}m")
```

## Best Practices
### Alert Design
- Avoid alert fatigue: Set appropriate thresholds
- Combine related alerts: Use aggregation rules
- Test thresholds: Run in parallel before deploying
- Monitor alert quality: Track false positive rate
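"Test thresholds: run in parallel before deploying" can be done offline by replaying recent verification records against a candidate threshold without firing any alerts. A minimal sketch in plain Python; `shadow_test` and the record shape are illustrative, not part of the TruthVouch SDK:

```python
def shadow_test(records, candidate_threshold, current_threshold):
    """Count how many alerts each threshold would have fired on past data."""
    current = sum(1 for r in records if r["confidence"] < current_threshold)
    candidate = sum(1 for r in records if r["confidence"] < candidate_threshold)
    return {"current": current, "candidate": candidate,
            "delta": candidate - current}

# Replay a small sample of recent confidence scores
records = [{"confidence": c} for c in (0.95, 0.82, 0.66, 0.71, 0.58)]
result = shadow_test(records, candidate_threshold=0.75, current_threshold=0.7)
```

If `delta` is large, the candidate threshold would meaningfully increase alert volume, and you can tune it before enabling the rule.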
### Operational
- Route by expertise: Send healthcare alerts to compliance team
- Escalate on timeout: If ack not received in 30m, escalate
- Document suppression: Always include reason when suppressing
- Review weekly: Check alert logs for patterns
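"Escalate on timeout" can be implemented with a periodic sweep over open alerts. A sketch in plain Python; the `find_overdue` helper and alert fields are illustrative, not SDK names:

```python
import time

ACK_TIMEOUT_SECONDS = 30 * 60  # escalate if no ack within 30 minutes

def find_overdue(open_alerts, now=None):
    """Return alerts still unacknowledged past the timeout."""
    now = time.time() if now is None else now
    return [a for a in open_alerts
            if not a["acknowledged"]
            and now - a["created_at"] > ACK_TIMEOUT_SECONDS]

alerts = [
    {"id": "a1", "created_at": 0,    "acknowledged": False},
    {"id": "a2", "created_at": 0,    "acknowledged": True},
    {"id": "a3", "created_at": 1500, "acknowledged": False},
]
# At t=1900s only a1 has been open longer than 1800s without an ack
overdue = find_overdue(alerts, now=1900)
```

Run a sweep like this on a schedule (e.g. every minute) and page the next tier for anything it returns.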
### Integration
- Test webhooks: Verify endpoint before production
- Handle retries: Implement idempotency in handlers
- Monitor deliverability: Track delivery success rate
- Graceful degradation: Continue operation if channel fails
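"Handle retries: implement idempotency in handlers" means a redelivered webhook must not create a second ticket or page. The usual approach is to dedupe on a stable alert ID. A minimal sketch; the in-memory set stands in for a durable store such as Redis or a database table:

```python
processed_ids = set()  # in production, back this with Redis or a DB table

def handle_delivery(payload):
    """Process a webhook delivery exactly once, even if it is retried."""
    alert_id = payload["id"]
    if alert_id in processed_ids:
        return "duplicate"  # retry of a delivery we already handled
    processed_ids.add(alert_id)
    # ... create ticket, page on-call, etc.
    return "processed"

first = handle_delivery({"id": "alert_123"})
retry = handle_delivery({"id": "alert_123"})  # same payload redelivered
```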
## Troubleshooting
### Q: Not receiving alerts
- Verify rule is enabled
- Check trigger condition (test with recent data)
- Confirm channel configuration (email deliverability)
- Check webhook endpoint is accessible
### Q: Too many false alerts
- Increase confidence threshold
- Add context to trigger condition
- Use aggregation to group similar alerts
- Review recent model changes
### Q: Webhook delivery delays
- Check target endpoint latency
- Implement async processing on receiver
- Reduce payload size if possible
- Use dead letter queue for failed deliveries
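The dead-letter-queue pattern for failed deliveries can be sketched in a few lines: retry a bounded number of times, then park the payload for inspection instead of dropping it. The `deliver_with_dlq` helper and `MAX_ATTEMPTS` value here are illustrative:

```python
from collections import deque

MAX_ATTEMPTS = 3
dead_letter_queue = deque()  # failed deliveries parked for later inspection

def deliver_with_dlq(payload, send):
    """Try delivery a few times, then park the payload instead of dropping it."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            send(payload)
            return True
        except ConnectionError:
            continue  # transient failure, retry
    dead_letter_queue.append(payload)
    return False

def always_down(payload):
    """Simulate an endpoint that is unreachable."""
    raise ConnectionError("endpoint unreachable")

ok = deliver_with_dlq({"id": "alert_9"}, always_down)
```

A separate job can later drain the queue and replay or alert on its contents.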
## Next Steps
- Review Alert Channels for detailed setup
- Explore Alert Workflows for automation
- Check Webhook Events for event types