ADR-009: Event-Driven Call Dispatch

How backend systems integrate with Agent Studio to trigger voice calls from events

Status

Accepted

Context

Backend systems need to trigger voice calls based on various events:

  • Kafka events: User actions, system events, scheduled triggers
  • Webhook callbacks: External system notifications
  • Cron jobs: Scheduled outbound calls
  • API triggers: Direct integration from backend services

We needed to decide how external systems should integrate with Agent Studio to dispatch calls without requiring changes to the core platform for each new integration.

Requirements

  1. Stateless integration (no state stored in Agent Studio for event processing)
  2. Support for multiple event sources (Kafka, webhooks, cron, direct API)
  3. Multi-tenant isolation (each tenant's events only trigger their workflows)
  4. Minimal latency between event and call dispatch
  5. No message queue infrastructure required within Agent Studio

Decision

Use the existing REST API endpoint POST /api/v1/calls with API key authentication for all external integrations.

Integration Pattern

┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│  Event Source   │     │  Backend Service │     │  Agent Studio   │
│  (Kafka, etc.)  │────▶│  (Consumer)      │────▶│  REST API       │
└─────────────────┘     └──────────────────┘     └─────────────────┘
                               │                         │
                               │  POST /api/v1/calls     │
                               │  Authorization: Bearer  │
                               │  {api_key}              │
                               │                         ▼
                               │                  ┌─────────────────┐
                               │                  │  LiveKit        │
                               │                  │  Voice Call     │
                               └─────────────────▶└─────────────────┘

API Contract

Endpoint: POST /api/v1/calls

Authentication: API Key with calls:create scope

Request:

{
  "workflow_slug": "daily-health-check",
  "user_id": "user-123",
  "user_context": {
    "user_name": "John",
    "phone": "+1234567890",
    "custom_data": {}
  },
  "metadata": {
    "source": "kafka",
    "event_id": "evt-456",
    "triggered_at": "2026-01-19T10:00:00Z"
  }
}

Response:

{
  "call_id": "uuid",
  "room_name": "call-abc123",
  "token": "livekit-token",
  "url": "wss://livekit.example.com",
  "status": "pending"
}
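A backend can assemble the request body above with a small helper. This is a sketch: the field layout mirrors the example payload exactly, and the choice to stamp `triggered_at` at build time is an assumption, not part of the contract.

```python
from datetime import datetime, timezone

def build_call_request(workflow_slug: str, user_id: str, user_name: str,
                       phone: str, source: str, event_id: str) -> dict:
    """Assemble a POST /api/v1/calls body matching the contract above."""
    return {
        "workflow_slug": workflow_slug,
        "user_id": user_id,
        "user_context": {
            "user_name": user_name,
            "phone": phone,
            "custom_data": {},
        },
        "metadata": {
            "source": source,
            "event_id": event_id,
            # Timestamp recorded at build time (an assumption; the API may
            # also accept a timestamp supplied by the originating event).
            "triggered_at": datetime.now(timezone.utc).isoformat(),
        },
    }
```

Centralizing payload construction keeps the `metadata.source` tagging consistent across Kafka, webhook, and cron integrations, which is what makes calls auditable later.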

Call Status Polling

Backend services can poll for call completion:

GET /api/v1/calls/{call_id}

Returns call status, transcript, and metrics when complete.

API Key Scopes

Recommended scopes for backend integration:

  • calls:create - Dispatch new calls
  • calls:read - Check call status and retrieve results
  • workflows:read - Validate workflow exists before dispatch

Consequences

Positive

  • Simple integration: Standard REST API, no special SDKs required
  • Language agnostic: Any language/platform can integrate via HTTP
  • Stateless: Agent Studio doesn't need to know about Kafka, RabbitMQ, etc.
  • Scalable: Backend services handle event processing and can scale independently
  • Flexible: Same API works for all event sources
  • Auditable: All calls have metadata tracking the source

Negative

  • No built-in retry: Backend service must implement retry logic for failed dispatches
  • Polling for results: No push notification when call completes (must poll or use webhooks)
  • Rate limiting: Backend must respect API rate limits (default 600 req/min per key)

Neutral

  • Backend service required: Each integration needs a service to consume events and call the API
  • API key management: Each integration needs its own API key for isolation and auditing

Example Implementations

Kafka Consumer (Python)

import asyncio
import json

import httpx
from aiokafka import AIOKafkaConsumer

async def handle_event(event: dict, client: httpx.AsyncClient) -> None:
    if event["type"] == "user.login":
        response = await client.post(
            "https://api.example.com/api/v1/calls",
            headers={"Authorization": "Bearer as_live_xxx"},
            json={
                "workflow_slug": "welcome-call",
                "user_id": event["user_id"],
                "user_context": {"user_name": event["name"]},
                "metadata": {"source": "kafka", "event_id": event["id"]},
            },
        )
        response.raise_for_status()

async def main() -> None:
    # Consume user events from Kafka and dispatch calls via the REST API.
    consumer = AIOKafkaConsumer(
        "user-events",
        bootstrap_servers="localhost:9092",
        value_deserializer=json.loads,
    )
    await consumer.start()
    try:
        async with httpx.AsyncClient() as client:
            async for message in consumer:
                await handle_event(message.value, client)
    finally:
        await consumer.stop()

asyncio.run(main())

Cron Job (Bash + curl)

#!/bin/bash
# Daily health check calls for all active users.
# USER_IDS is expected to be supplied by the scheduler,
# e.g. USER_IDS="user-123 user-456".

set -euo pipefail

for USER_ID in $USER_IDS; do
  curl -sf -X POST "https://api.example.com/api/v1/calls" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json" \
    -d '{
      "workflow_slug": "daily-health-check",
      "user_id": "'"$USER_ID"'",
      "user_context": {},
      "metadata": {"source": "cron", "schedule": "daily-9am"}
    }'
done

Alternatives Considered

1. Built-in Kafka Consumer

Rejected because:

  • Adds infrastructure complexity to Agent Studio
  • Would need to support multiple message queue systems
  • Harder to customize event processing logic per tenant

2. Webhook-based Push

Partially adopted: Tenants can configure webhook endpoints in tools for callbacks during calls, but initial call dispatch still goes through the REST API.

3. gRPC API

Deferred: REST is sufficient for current scale. May consider gRPC for high-throughput scenarios in the future.
