Agent-Powered Email Notifications with Resend and Keystore
beginner · wrap · openai · resend · 3 min read
Keystore isn't just for LLM providers. Any HTTP-based API can be routed through the vault — including Resend for email. This example shows how to wrap both OpenAI and Resend, building an agent that detects anomalies and sends alerts.
What you'll build
An anomaly detection agent that:
- Analyzes a dataset with OpenAI
- Detects unusual patterns
- Sends an email alert via Resend if anomalies are found
Both API keys are resolved from the Keystore vault.
Prerequisites
- A Keystore account with an agent token
- OpenAI and Resend keys in your vault
- Node.js 18+
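The snippets below read the agent token from the KS_TOKEN environment variable, so set it in your shell before running anything (the token value shown is a placeholder, not a real credential):

```shell
# Agent token created in your Keystore account (placeholder value)
export KS_TOKEN="ks_agent_xxxxxxxx"
```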
Setup
1. Install dependencies

```bash
npm install @keystore/sdk openai resend
```
2. Wrap both clients

```typescript
import Keystore from "@keystore/sdk";
import OpenAI from "openai";
import { Resend } from "resend";

const ks = new Keystore({ agentToken: process.env.KS_TOKEN! });

// Wrap each client individually — both route through the vault
const openai = ks.wrap(new OpenAI());
const resend = ks.wrap(new Resend("placeholder"));
```
The "placeholder" API key passed to Resend is replaced by the vault at request time. It doesn't matter what you put here — the real key is never in your code.
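To build intuition for how this substitution can work, here is a conceptual sketch of a vault-backed wrapper that swaps the real credential into the Authorization header at request time. The `vaultFetch` name, the `KeyResolver` type, and the header logic are illustrative assumptions, not the actual internals of `@keystore/sdk`:

```typescript
// Illustrative sketch only: how a wrapper could discard a placeholder
// credential and substitute the vault-resolved key per request.
type KeyResolver = (provider: string) => Promise<string>;

function vaultFetch(resolveKey: KeyResolver, provider: string): typeof fetch {
  return async (input, init) => {
    const headers = new Headers(init?.headers);
    // Whatever placeholder the client was constructed with is overwritten
    // here; the real key never appears in application code.
    headers.set("Authorization", `Bearer ${await resolveKey(provider)}`);
    return fetch(input, { ...init, headers });
  };
}
```

Because the key is resolved per request, rotating or revoking it in the vault takes effect immediately without redeploying the agent.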
3. Analyze data for anomalies

```typescript
const metrics = [
  { date: "2026-03-01", requests: 1200, errors: 3, latency_ms: 45 },
  { date: "2026-03-02", requests: 1150, errors: 2, latency_ms: 42 },
  { date: "2026-03-03", requests: 1180, errors: 85, latency_ms: 320 },
  { date: "2026-03-04", requests: 1220, errors: 4, latency_ms: 48 },
];

const analysis = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      role: "system",
      content:
        "You are a monitoring agent. Analyze metrics and identify anomalies. " +
        "Respond with JSON: { anomalies: [{ date, metric, value, reason }], summary: string }",
    },
    {
      role: "user",
      content: `Analyze these daily metrics:\n${JSON.stringify(metrics, null, 2)}`,
    },
  ],
  response_format: { type: "json_object" },
});

const result = JSON.parse(analysis.choices[0].message.content!);
```
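The reply is parsed with a non-null assertion and no schema check. `json_object` mode guarantees syntactically valid JSON, not that it matches the shape the prompt asked for, so it helps to type the result and guard before using it. A small sketch; the `Anomaly` and `AnalysisResult` interfaces simply mirror the schema requested in the system prompt:

```typescript
interface Anomaly {
  date: string;
  metric: string;
  value: number;
  reason: string;
}

interface AnalysisResult {
  anomalies: Anomaly[];
  summary: string;
}

// Parse defensively: valid JSON is guaranteed, this schema is not.
function parseAnalysis(raw: string): AnalysisResult {
  const parsed = JSON.parse(raw);
  if (!Array.isArray(parsed.anomalies) || typeof parsed.summary !== "string") {
    throw new Error("Model reply did not match the expected schema");
  }
  return parsed as AnalysisResult;
}
```

Failing loudly here is deliberate: a malformed reply should stop the run rather than silently send an empty or garbled alert.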
4. Send alert email if anomalies found
```typescript
if (result.anomalies.length > 0) {
  const anomalyList = result.anomalies
    .map((a: any) => `- ${a.date}: ${a.metric} = ${a.value} (${a.reason})`)
    .join("\n");

  await resend.emails.send({
    from: "monitoring@yourapp.com",
    to: "ops-team@yourapp.com",
    subject: `Alert: ${result.anomalies.length} anomalies detected`,
    text: `${result.summary}\n\nAnomalies:\n${anomalyList}`,
  });
  console.log("Alert email sent");
} else {
  console.log("No anomalies detected");
}
```

Full example
```typescript
import Keystore from "@keystore/sdk";
import OpenAI from "openai";
import { Resend } from "resend";

async function monitor() {
  const ks = new Keystore({ agentToken: process.env.KS_TOKEN! });
  const openai = ks.wrap(new OpenAI());
  const resend = ks.wrap(new Resend("placeholder"));

  // Simulate fetching metrics
  const metrics = [
    { date: "2026-03-01", requests: 1200, errors: 3, latency_ms: 45 },
    { date: "2026-03-02", requests: 1150, errors: 2, latency_ms: 42 },
    { date: "2026-03-03", requests: 1180, errors: 85, latency_ms: 320 },
    { date: "2026-03-04", requests: 1220, errors: 4, latency_ms: 48 },
  ];

  const analysis = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content:
          "Analyze metrics for anomalies. Respond with JSON: " +
          "{ anomalies: [{ date, metric, value, reason }], summary: string }",
      },
      { role: "user", content: JSON.stringify(metrics) },
    ],
    response_format: { type: "json_object" },
  });

  const result = JSON.parse(analysis.choices[0].message.content!);

  if (result.anomalies.length > 0) {
    await resend.emails.send({
      from: "monitoring@yourapp.com",
      to: "ops-team@yourapp.com",
      subject: `Alert: ${result.anomalies.length} anomalies detected`,
      text: result.summary,
    });
    console.log("Alert sent:", result.summary);
  } else {
    console.log("All clear");
  }
}

// Surface failures instead of dying on an unhandled rejection
monitor().catch(console.error);
```

Why wrap non-LLM providers?
The same security benefits apply to every API:
- No exposed keys — Resend API key stays in the vault
- Audit trail — every email send is logged
- Budget controls — limit how many emails an agent can send
- Kill switch — revoke the agent token to stop all API access instantly
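Vault-enforced budgets are the backstop, but the agent can also cap itself so it fails fast before hitting the vault. A minimal application-side sketch; the `makeSendBudget` helper and its limit are illustrative, not a Keystore feature:

```typescript
// Simple application-side send cap, independent of any vault-enforced budget.
function makeSendBudget(maxSends: number) {
  let sent = 0;
  return {
    // Returns true and consumes one send if budget remains, else false.
    trySend(): boolean {
      if (sent >= maxSends) return false;
      sent += 1;
      return true;
    },
    remaining: () => maxSends - sent,
  };
}
```

If the vault-side budget and a local cap like this disagree, the stricter one wins: the vault rejects the request even when the local counter would allow it.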
Next steps
- Add database access with Neon Postgres
- Try S3 file uploads for storing reports
- Set up production webhooks for budget alerts