Alerts API

Configure alert rules and notification channels programmatically. Set up complex routing, escalation policies, and custom notification logic.

Alert Rules

List Alert Rules

GET /v1/alert-rules

# Query parameters:
?enabled=true          # Filter by status
?monitor_id=mon_abc    # Filter by monitor

Response

{
  "data": [
    {
      "id": "rule_abc123",
      "name": "Production API Down",
      "enabled": true,
      "conditions": {
        "consecutiveFailures": 3,
        "latencyThresholdMs": 2000
      },
      "channels": ["chan_slack", "chan_email"],
      "monitors": ["mon_abc123"],
      "createdAt": "2026-01-15T10:00:00Z"
    }
  ]
}
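If you are not using one of the SDKs, the list endpoint can be called with any HTTP client. The sketch below only builds the request URL with the two documented filters; the base URL and bearer-token auth scheme are assumptions, and `alert_rules_url` is a hypothetical helper, not part of any SDK.

```python
from urllib.parse import urlencode

def alert_rules_url(base_url, enabled=None, monitor_id=None):
    """Build the GET /v1/alert-rules URL with optional filters.

    enabled is serialized as the lowercase strings "true"/"false",
    matching the query-parameter examples above.
    """
    params = {}
    if enabled is not None:
        params["enabled"] = "true" if enabled else "false"
    if monitor_id is not None:
        params["monitor_id"] = monitor_id
    query = urlencode(params)
    return f"{base_url}/v1/alert-rules" + (f"?{query}" if query else "")

# e.g. alert_rules_url("https://api.example.com", enabled=True, monitor_id="mon_abc")
```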

Create Alert Rule

POST /v1/alert-rules
Content-Type: application/json

{
  "name": "Critical API Failures",
  "enabled": true,
  "conditions": {
    "consecutiveFailures": 3,
    "latencyThresholdMs": 2000
  },
  "channels": ["chan_slack_123", "chan_pagerduty_456"],
  "monitors": ["mon_api_001", "mon_api_002"],
  "notifyOnRecovery": true,
  "escalationPolicy": {
    "delayMinutes": 15,
    "escalateToChannels": ["chan_pagerduty_oncall"]
  }
}

# Response: 201 Created
{
  "id": "rule_new_123",
  "name": "Critical API Failures",
  "enabled": true,
  "createdAt": "2026-02-13T12:00:00Z"
}

Update Alert Rule

PUT /v1/alert-rules/:id
Content-Type: application/json

{
  "enabled": false,
  "conditions": {
    "consecutiveFailures": 5
  }
}

# Response: 200 OK
{
  "id": "rule_abc123",
  "enabled": false,
  "updatedAt": "2026-02-13T12:05:00Z"
}

Delete Alert Rule

DELETE /v1/alert-rules/:id

# Response: 204 No Content

Alert Channels

List Alert Channels

GET /v1/alert-channels

# Query parameters:
?type=slack    # Filter by type

Response

{
  "data": [
    {
      "id": "chan_slack_123",
      "name": "Engineering Team",
      "type": "slack",
      "enabled": true,
      "config": {
        "webhookUrl": "https://hooks.slack.com/...",
        "channel": "#alerts"
      },
      "createdAt": "2026-01-15T10:00:00Z"
    }
  ]
}

Create Alert Channel

Slack Channel

POST /v1/alert-channels
Content-Type: application/json

{
  "name": "Engineering Slack",
  "type": "slack",
  "config": {
    "webhookUrl": "https://hooks.slack.com/services/YOUR/WEBHOOK/URL",
    "channel": "#alerts",
    "mentions": ["@oncall"]
  }
}

# Response: 201 Created
{
  "id": "chan_slack_new",
  "name": "Engineering Slack",
  "type": "slack",
  "enabled": true
}

Email Channel

{
  "name": "Team Email",
  "type": "email",
  "config": {
    "recipients": ["team@example.com", "oncall@example.com"]
  }
}

PagerDuty Channel

{
  "name": "Production Incidents",
  "type": "pagerduty",
  "config": {
    "integrationKey": "YOUR_PAGERDUTY_KEY",
    "severity": "critical"
  }
}

Webhook Channel

{
  "name": "Custom Integration",
  "type": "webhook",
  "config": {
    "url": "https://api.example.com/webhooks/alerts",
    "method": "POST",
    "headers": {
      "Authorization": "Bearer YOUR_TOKEN"
    }
  }
}

Update Alert Channel

PUT /v1/alert-channels/:id
Content-Type: application/json

{
  "enabled": false
}

Delete Alert Channel

DELETE /v1/alert-channels/:id

# Response: 204 No Content

Test Alert Channel

Send a test alert to verify configuration:

POST /v1/alert-channels/:id/test

# Response: 200 OK
{
  "success": true,
  "message": "Test alert sent successfully",
  "deliveredAt": "2026-02-13T12:10:00Z"
}

Alert Conditions

Condition              Description                      Example
consecutiveFailures    Number of consecutive failures   {"consecutiveFailures": 3}
latencyThresholdMs     Response time threshold (ms)     {"latencyThresholdMs": 2000}
errorRatePercent       Error rate percentage            {"errorRatePercent": 10}
statusCode             Specific HTTP status code        {"statusCode": 500}
availability           Uptime percentage threshold      {"availability": 99.5}
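Catching a malformed conditions object before sending the request saves a round trip. A minimal client-side check, assuming the value types implied by the examples in the table (the `CONDITION_TYPES` map and `validate_conditions` helper are illustrative, not part of the API):

```python
# Known condition keys and the value types suggested by the table above.
# This is a client-side convenience; the server remains the authority.
CONDITION_TYPES = {
    "consecutiveFailures": int,
    "latencyThresholdMs": int,
    "errorRatePercent": (int, float),
    "statusCode": int,
    "availability": (int, float),
}

def validate_conditions(conditions):
    """Return a list of problems found in a conditions dict (empty list = OK)."""
    problems = []
    for key, value in conditions.items():
        if key not in CONDITION_TYPES:
            problems.append(f"unknown condition: {key}")
        elif not isinstance(value, CONDITION_TYPES[key]):
            problems.append(f"{key}: unexpected type {type(value).__name__}")
    return problems
```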

Channel Types

Type        Description             Config Fields
email       Email notifications     recipients
slack       Slack webhooks          webhookUrl, channel, mentions
discord     Discord webhooks        webhookUrl
telegram    Telegram bot            botToken, chatId
pagerduty   PagerDuty incidents     integrationKey, severity
opsgenie    Opsgenie alerts         apiKey, priority
webhook     Custom HTTP endpoint    url, method, headers
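When building channel payloads programmatically, it helps to check that the config carries the fields each type needs. The sketch below treats one field per type as mandatory based on the table above; which fields are strictly required versus optional (e.g. `channel`, `mentions`, `severity`) is an assumption, and `missing_config_fields` is a hypothetical helper.

```python
# One assumed-required config field set per channel type, from the table above.
REQUIRED_CONFIG = {
    "email": {"recipients"},
    "slack": {"webhookUrl"},
    "discord": {"webhookUrl"},
    "telegram": {"botToken", "chatId"},
    "pagerduty": {"integrationKey"},
    "opsgenie": {"apiKey"},
    "webhook": {"url"},
}

def missing_config_fields(channel):
    """Return the sorted list of required config fields absent from a channel payload."""
    required = REQUIRED_CONFIG.get(channel.get("type"), set())
    return sorted(required - set(channel.get("config", {})))
```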

Escalation Policies

Escalate alerts to different channels if not acknowledged:

{
  "conditions": {
    "consecutiveFailures": 3
  },
  "channels": ["chan_slack"],
  "escalationPolicy": {
    "delayMinutes": 15,
    "escalateToChannels": ["chan_pagerduty"],
    "maxEscalations": 2
  }
}

# Timeline:
# 0 min: Alert to Slack
# 15 min (if not ack): Escalate to PagerDuty
# 30 min (if not ack): Escalate again (up to maxEscalations)
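The timeline above can be computed client-side, for example to display upcoming escalation times in a dashboard. A small sketch, assuming each escalation fires `delayMinutes` after the previous one (the `escalation_schedule` function is illustrative, not part of any SDK):

```python
from datetime import datetime, timedelta

def escalation_schedule(alert_time, delay_minutes, max_escalations):
    """Return the times at which an unacknowledged alert would escalate."""
    delay = timedelta(minutes=delay_minutes)
    return [alert_time + delay * (n + 1) for n in range(max_escalations)]

# With delayMinutes=15 and maxEscalations=2, an alert at 12:00
# escalates at 12:15 and again at 12:30.
times = escalation_schedule(datetime(2026, 2, 13, 12, 0), 15, 2)
```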

Alert Silencing

Silence Alerts for Monitor

POST /v1/monitors/:id/silence
Content-Type: application/json

{
  "duration": 3600,
  "reason": "Scheduled maintenance"
}

# duration is in seconds

# Response: 200 OK
{
  "silencedUntil": "2026-02-13T13:00:00Z",
  "reason": "Scheduled maintenance"
}
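The silencedUntil value in the response is simply the request time plus duration. A quick sketch of computing the expected expiry locally (the `silenced_until` helper is illustrative):

```python
from datetime import datetime, timedelta, timezone

def silenced_until(now, duration_seconds):
    """Compute the expected silencedUntil timestamp for a silence request."""
    return (now + timedelta(seconds=duration_seconds)).strftime("%Y-%m-%dT%H:%M:%SZ")

start = datetime(2026, 2, 13, 12, 0, 0, tzinfo=timezone.utc)
silenced_until(start, 3600)  # "2026-02-13T13:00:00Z", matching the response above
```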

Unsilence Alerts

DELETE /v1/monitors/:id/silence

# Response: 204 No Content

Code Examples

Python

from blacktide import Client

client = Client(api_token="YOUR_TOKEN")

# Create Slack channel
slack = client.alert_channels.create({
    "name": "Engineering",
    "type": "slack",
    "config": {
        "webhookUrl": "https://hooks.slack.com/...",
        "channel": "#alerts"
    }
})

# Create alert rule
rule = client.alert_rules.create({
    "name": "API Failures",
    "conditions": {"consecutiveFailures": 3},
    "channels": [slack.id],
    "monitors": ["mon_api_001"]
})

# Test channel
client.alert_channels.test(slack.id)

JavaScript/TypeScript

import { BlackTide } from '@blacktide/sdk';

const client = new BlackTide('YOUR_TOKEN');

// Create email channel
const email = await client.alertChannels.create({
  name: 'Team Email',
  type: 'email',
  config: {
    recipients: ['team@example.com']
  }
});

// Create rule with escalation (pagerdutyChannel is another channel,
// created the same way as the email channel above)
const rule = await client.alertRules.create({
  name: 'Critical Failures',
  conditions: { consecutiveFailures: 5 },
  channels: [email.id],
  escalationPolicy: {
    delayMinutes: 15,
    escalateToChannels: [pagerdutyChannel.id]
  }
});

Next Steps