Alerts API
Configure alert rules and notification channels programmatically. Set up complex routing, escalation policies, and custom notification logic.
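The endpoints below presumably accept a bearer token (an assumption based on the SDK's `api_token` parameter; the base URL here is hypothetical). A minimal sketch of building a list request with Python's standard library:

```python
import urllib.parse
import urllib.request

BASE_URL = "https://api.blacktide.example.com"  # hypothetical base URL

def build_list_request(token, enabled=None, monitor_id=None):
    """Build a GET /v1/alert-rules request with optional filters."""
    params = {}
    if enabled is not None:
        params["enabled"] = "true" if enabled else "false"
    if monitor_id:
        params["monitor_id"] = monitor_id
    query = ("?" + urllib.parse.urlencode(params)) if params else ""
    return urllib.request.Request(
        f"{BASE_URL}/v1/alert-rules{query}",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_list_request("YOUR_TOKEN", enabled=True, monitor_id="mon_abc")
# urllib.request.urlopen(req) would then return the JSON body shown below
```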
Alert Rules
List Alert Rules
GET /v1/alert-rules
# Query parameters:
?enabled=true         # Filter by status
?monitor_id=mon_abc   # Filter by monitor

Response
{
  "data": [
    {
      "id": "rule_abc123",
      "name": "Production API Down",
      "enabled": true,
      "conditions": {
        "consecutiveFailures": 3,
        "latencyThresholdMs": 2000
      },
      "channels": ["chan_slack", "chan_email"],
      "monitors": ["mon_abc123"],
      "createdAt": "2026-01-15T10:00:00Z"
    }
  ]
}

Create Alert Rule
POST /v1/alert-rules
Content-Type: application/json
{
  "name": "Critical API Failures",
  "enabled": true,
  "conditions": {
    "consecutiveFailures": 3,
    "latencyThresholdMs": 2000
  },
  "channels": ["chan_slack_123", "chan_pagerduty_456"],
  "monitors": ["mon_api_001", "mon_api_002"],
  "notifyOnRecovery": true,
  "escalationPolicy": {
    "delayMinutes": 15,
    "escalateToChannels": ["chan_pagerduty_oncall"]
  }
}
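A rule needs a name, at least one channel, and condition keys the API recognizes. A hedged client-side check of a payload like the one above before POSTing (this helper is illustrative, not part of any SDK):

```python
VALID_CONDITIONS = {
    "consecutiveFailures", "latencyThresholdMs",
    "errorRatePercent", "statusCode", "availability",
}

def validate_rule(payload):
    """Return a list of problems with an alert-rule payload (empty = OK)."""
    problems = []
    if not payload.get("name"):
        problems.append("name is required")
    if not payload.get("channels"):
        problems.append("at least one channel is required")
    conditions = payload.get("conditions") or {}
    unknown = set(conditions) - VALID_CONDITIONS
    if unknown:
        problems.append(f"unknown conditions: {sorted(unknown)}")
    return problems
```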
# Response: 201 Created
{
  "id": "rule_new_123",
  "name": "Critical API Failures",
  "enabled": true,
  "createdAt": "2026-02-13T12:00:00Z"
}

Update Alert Rule
PUT /v1/alert-rules/:id
Content-Type: application/json
{
  "enabled": false,
  "conditions": {
    "consecutiveFailures": 5
  }
}
# Response: 200 OK
{
  "id": "rule_abc123",
  "enabled": false,
  "updatedAt": "2026-02-13T12:05:00Z"
}

Delete Alert Rule
DELETE /v1/alert-rules/:id
# Response: 204 No Content

Alert Channels
List Alert Channels
GET /v1/alert-channels
# Query parameters:
?type=slack   # Filter by type

Response
{
  "data": [
    {
      "id": "chan_slack_123",
      "name": "Engineering Team",
      "type": "slack",
      "enabled": true,
      "config": {
        "webhookUrl": "https://hooks.slack.com/...",
        "channel": "#alerts"
      },
      "createdAt": "2026-01-15T10:00:00Z"
    }
  ]
}

Create Alert Channel
Slack Channel
POST /v1/alert-channels
Content-Type: application/json
{
  "name": "Engineering Slack",
  "type": "slack",
  "config": {
    "webhookUrl": "https://hooks.slack.com/services/YOUR/WEBHOOK/URL",
    "channel": "#alerts",
    "mentions": ["@oncall"]
  }
}
# Response: 201 Created
{
  "id": "chan_slack_new",
  "name": "Engineering Slack",
  "type": "slack",
  "enabled": true
}

Email Channel
{
  "name": "Team Email",
  "type": "email",
  "config": {
    "recipients": ["team@example.com", "oncall@example.com"]
  }
}

PagerDuty Channel
{
  "name": "Production Incidents",
  "type": "pagerduty",
  "config": {
    "integrationKey": "YOUR_PAGERDUTY_KEY",
    "severity": "critical"
  }
}

Webhook Channel
{
  "name": "Custom Integration",
  "type": "webhook",
  "config": {
    "url": "https://api.example.com/webhooks/alerts",
    "method": "POST",
    "headers": {
      "Authorization": "Bearer YOUR_TOKEN"
    }
  }
}

Update Alert Channel
PUT /v1/alert-channels/:id
Content-Type: application/json
{
  "enabled": false
}

Delete Alert Channel
DELETE /v1/alert-channels/:id
# Response: 204 No Content

Test Alert Channel
Send a test alert to verify configuration:
POST /v1/alert-channels/:id/test
# Response: 200 OK
{
  "success": true,
  "message": "Test alert sent successfully",
  "deliveredAt": "2026-02-13T12:10:00Z"
}

Alert Conditions
| Condition | Description | Example |
|---|---|---|
| consecutiveFailures | Number of consecutive failures | `{"consecutiveFailures": 3}` |
| latencyThresholdMs | Response time threshold (ms) | `{"latencyThresholdMs": 2000}` |
| errorRatePercent | Error rate percentage | `{"errorRatePercent": 10}` |
| statusCode | Specific HTTP status code | `{"statusCode": 500}` |
| availability | Uptime percentage threshold | `{"availability": 99.5}` |
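To illustrate what each condition inspects, here is a sketch that evaluates a rule's conditions against a sample set of monitor stats. The stat field names are illustrative, and whether the API ANDs or ORs multiple conditions is not specified here; this sketch ANDs them as an assumption:

```python
def conditions_met(conditions, stats):
    """Evaluate rule conditions against recent monitor stats (field names illustrative)."""
    tests = {
        "consecutiveFailures": lambda v: stats["consecutiveFailures"] >= v,
        "latencyThresholdMs": lambda v: stats["latencyMs"] > v,
        "errorRatePercent": lambda v: stats["errorRatePercent"] > v,
        "statusCode": lambda v: stats["statusCode"] == v,
        "availability": lambda v: stats["availabilityPercent"] < v,
    }
    # Assumption: all conditions must hold (AND semantics)
    return all(tests[name](value) for name, value in conditions.items())

sample = {"consecutiveFailures": 4, "latencyMs": 2500,
          "errorRatePercent": 2, "statusCode": 500, "availabilityPercent": 99.9}
conditions_met({"consecutiveFailures": 3, "latencyThresholdMs": 2000}, sample)  # True
```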
Channel Types
| Type | Description | Config Fields |
|---|---|---|
| email | Email notifications | `recipients` |
| slack | Slack webhooks | `webhookUrl`, `channel`, `mentions` |
| discord | Discord webhooks | `webhookUrl` |
| telegram | Telegram bot | `botToken`, `chatId` |
| pagerduty | PagerDuty incidents | `integrationKey`, `severity` |
| opsgenie | Opsgenie alerts | `apiKey`, `priority` |
| webhook | Custom HTTP endpoint | `url`, `method`, `headers` |
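The per-type config fields above can be checked client-side before creating a channel. A hedged sketch; which fields are strictly required versus optional is an assumption inferred from the request examples above:

```python
# Assumed required subset per channel type; the remaining table
# fields (channel, mentions, severity, method, headers) appear optional.
REQUIRED_CONFIG = {
    "email": {"recipients"},
    "slack": {"webhookUrl"},
    "discord": {"webhookUrl"},
    "telegram": {"botToken", "chatId"},
    "pagerduty": {"integrationKey"},
    "opsgenie": {"apiKey"},
    "webhook": {"url"},
}

def missing_config_fields(channel_type, config):
    """Return required config fields absent for the given channel type."""
    return sorted(REQUIRED_CONFIG.get(channel_type, set()) - set(config))
```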
Escalation Policies
Escalate alerts to different channels if not acknowledged:
{
  "conditions": {
    "consecutiveFailures": 3
  },
  "channels": ["chan_slack"],
  "escalationPolicy": {
    "delayMinutes": 15,
    "escalateToChannels": ["chan_pagerduty"],
    "maxEscalations": 2
  }
}
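The notification schedule implied by such a policy can be computed from `delayMinutes` and `maxEscalations`; a sketch, assuming escalations are evenly spaced at the configured delay:

```python
def escalation_schedule(delay_minutes, max_escalations):
    """Minutes after the initial alert at which each escalation fires."""
    return [delay_minutes * n for n in range(1, max_escalations + 1)]

escalation_schedule(15, 2)  # [15, 30]
```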
# Timeline:
# 0 min: Alert sent to Slack
# 15 min (if not acknowledged): Escalate to PagerDuty
# 30 min (if not acknowledged): Escalate again (up to maxEscalations)

Alert Silencing
Silence Alerts for Monitor
POST /v1/monitors/:id/silence
Content-Type: application/json
{
  "duration": 3600,   # seconds
  "reason": "Scheduled maintenance"
}
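The `silencedUntil` value in the response is simply the request time plus `duration` seconds; a quick sketch:

```python
from datetime import datetime, timedelta, timezone

def silenced_until(start, duration_seconds):
    """Compute the expiry timestamp for a silence window."""
    return start + timedelta(seconds=duration_seconds)

start = datetime(2026, 2, 13, 12, 0, 0, tzinfo=timezone.utc)
silenced_until(start, 3600).isoformat()  # '2026-02-13T13:00:00+00:00'
```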
# Response: 200 OK
{
  "silencedUntil": "2026-02-13T13:00:00Z",
  "reason": "Scheduled maintenance"
}

Unsilence Alerts
DELETE /v1/monitors/:id/silence
# Response: 204 No Content

Code Examples
Python
from blacktide import Client

client = Client(api_token="YOUR_TOKEN")

# Create Slack channel
slack = client.alert_channels.create({
    "name": "Engineering",
    "type": "slack",
    "config": {
        "webhookUrl": "https://hooks.slack.com/...",
        "channel": "#alerts"
    }
})

# Create alert rule
rule = client.alert_rules.create({
    "name": "API Failures",
    "conditions": {"consecutiveFailures": 3},
    "channels": [slack.id],
    "monitors": ["mon_api_001"]
})

# Test channel
client.alert_channels.test(slack.id)

JavaScript/TypeScript
import { BlackTide } from '@blacktide/sdk';

const client = new BlackTide('YOUR_TOKEN');

// Create email channel
const email = await client.alertChannels.create({
  name: 'Team Email',
  type: 'email',
  config: {
    recipients: ['team@example.com']
  }
});

// Create PagerDuty channel to escalate to
const pagerdutyChannel = await client.alertChannels.create({
  name: 'On-call PagerDuty',
  type: 'pagerduty',
  config: { integrationKey: 'YOUR_PAGERDUTY_KEY' }
});

// Create rule with escalation
const rule = await client.alertRules.create({
  name: 'Critical Failures',
  conditions: { consecutiveFailures: 5 },
  channels: [email.id],
  escalationPolicy: {
    delayMinutes: 15,
    escalateToChannels: [pagerdutyChannel.id]
  }
});

Next Steps
- Monitors API: Create monitors to trigger alerts
- Alert Rules: Detailed configuration guide
- Integrations: Slack, Discord, PagerDuty setup