
Build a Customer Support Agent

This guide covers building a single agent that queries a knowledge base and sends the final answer back to an external helpdesk system via a webhook.

Step 1: The Knowledge Base Connection

We'll use the `vector_search` tool. This assumes you've already pushed your Zendesk/Intercom articles into Savine's persistent memory layer.

support-agent.json

```json
{
  "name": "tier-1-support",
  "llm": {
    "provider": "openai",
    "model": "gpt-4o",
    "key_ref": "OPENAI_API_KEY"
  },
  "tools": ["vector_search"],
  "config": {
    "memory_type": "persistent",
    "system_prompt": "You are a polite Tier 1 support rep. Always use `vector_search` to find relevant docs. Never guess an answer. If you don't know, say 'I will escalate this'."
  },
  "webhooks": {
    "on_complete": "https://api.yourhelpdesk.com/v1/savine-callback"
  }
}
```
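
Before deploying, it can help to sanity-check the config file locally. Here's a minimal Python sketch; the required-key list is an assumption drawn from the example above, not an official Savine schema:

```python
import json

# Top-level keys the example config above uses. This list is an
# assumption based on this guide, not an official Savine schema.
REQUIRED_KEYS = ["name", "llm", "tools", "config"]

def validate_agent_config(raw: str) -> dict:
    """Parse an agent config and check the keys this guide relies on."""
    cfg = json.loads(raw)
    missing = [k for k in REQUIRED_KEYS if k not in cfg]
    if missing:
        raise ValueError(f"config missing keys: {missing}")
    # The Tier 1 agent is prompted to always call vector_search,
    # so the tool must actually be enabled.
    if "vector_search" not in cfg.get("tools", []):
        raise ValueError("tier-1 support agent needs the vector_search tool")
    return cfg
```

Run it over `support-agent.json` before deploying; a missing key or a forgotten `vector_search` entry fails fast instead of surfacing as a broken agent in production.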

Step 2: The Webhook Integration

In Savine, tasks are fundamentally asynchronous. A customer support app shouldn't block an HTTP thread waiting 15 seconds for an LLM to think.

When you submit a task:

```bash
curl -X POST https://api.savine.dev/v1/tasks \
  -H "X-API-Key: sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "agentId": "tier-1-support",
    "input": "How do I reset my password?",
    "metadata": { "ticket_id": "ABC-123" }
  }'
```

You immediately get back a 202 Accepted response containing the task ID.
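
The same submission from your middleware, sketched in Python with only the standard library. The endpoint, header, and field names come from the curl example above; everything else is illustrative:

```python
import json
import urllib.request

def build_task_request(agent_id: str, user_input: str, ticket_id: str,
                       api_key: str) -> urllib.request.Request:
    """Build the POST request for the task-submission endpoint shown above."""
    payload = {
        "agentId": agent_id,
        "input": user_input,
        # metadata is echoed back verbatim in the webhook callback,
        # so stash the helpdesk ticket ID here to correlate the answer.
        "metadata": {"ticket_id": ticket_id},
    }
    return urllib.request.Request(
        "https://api.savine.dev/v1/tasks",
        data=json.dumps(payload).encode(),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Usage (network call commented out; expects a 202 Accepted):
# req = build_task_request("tier-1-support", "How do I reset my password?",
#                          "ABC-123", "sk_live_...")
# with urllib.request.urlopen(req) as resp:
#     task = json.load(resp)  # body carries the task ID
```

Keeping request construction separate from the network call makes the correlation logic (ticket ID in, ticket ID out) easy to unit-test.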

The callback: when the agent finishes, Savine sends an HTTP POST request to your `on_complete` webhook URL.

```json
{
  "task_id": "tsk_89xyz",
  "status": "COMPLETED",
  "metadata": { "ticket_id": "ABC-123" },
  "output": "To reset your password, visit your account settings...",
  "metrics": { "cost_usd": 0.002 }
}
```
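
On your side, the webhook handler just needs to turn that payload into a helpdesk action. A framework-agnostic sketch, so it can sit behind any route; the field shapes come from the payload above, and the escalation check is an assumption based on the system prompt in Step 1, not a Savine feature:

```python
def route_callback(payload: dict) -> dict:
    """Decide what the middleware should do with a Savine callback.

    Returns a small action dict instead of calling the helpdesk API
    directly, so the routing logic stays easy to test.
    """
    ticket_id = payload["metadata"]["ticket_id"]

    # Anything other than COMPLETED goes to a human.
    if payload.get("status") != "COMPLETED":
        return {"action": "escalate", "ticket_id": ticket_id}

    answer = payload["output"]

    # The system prompt in Step 1 tells the agent to say
    # "I will escalate this" when it can't answer; catch that phrase
    # rather than posting a non-answer as a public reply. (Heuristic.)
    if "I will escalate this" in answer:
        return {"action": "escalate", "ticket_id": ticket_id}

    return {"action": "reply", "ticket_id": ticket_id, "body": answer}
```

Whatever web framework you use, the route body reduces to: parse the JSON, call `route_callback`, then hit your helpdesk's reply or escalate endpoint accordingly.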

Step 3: Deployment

  1. Set your model provider key (the OpenAI key referenced by `key_ref` in the agent config): `savine config set OPENAI_API_KEY="sk-..."`
  2. Deploy the agent: `savine agents deploy --file support-agent.json`
  3. Your webhook now fires automatically on task completion, routing the AI response into your Zendesk ticket via your middleware.