
OpenAI Integration

Step-by-step guide to using OpenAI through Keystore. Give your AI agents access to GPT models without exposing API keys.


This guide walks through connecting an AI agent to OpenAI via Keystore. Your agent uses a ks_ token instead of a real OpenAI API key. The Keystore proxy resolves the real credentials at request time.

Prerequisites

  • A Keystore account with an OpenAI provider configured
  • An agent token (starts with ks_)
  • Node.js 18+

If you do not have these yet, use the CLI to set them up:

```bash
keystore login
keystore providers add        # choose openai, byok, paste your key
keystore agents create        # save the ks_ token
```

Install Dependencies

```bash
npm install openai @keystore/sdk
```

Option A: Wrap the Client

The wrap method rewrites the OpenAI client's base URL and API key to point at the Keystore proxy. This is the most explicit approach.

```typescript
import OpenAI from "openai";
import { Keystore } from "@keystore/sdk";

const ks = new Keystore({ agentToken: "ks_abc123..." });
const openai = ks.wrap(new OpenAI());

const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Explain quantum computing in one sentence." }],
});

console.log(completion.choices[0].message.content);
```

You can also use the provider-specific wrapper directly:

```typescript
import OpenAI from "openai";
import { Keystore, wrapOpenAI } from "@keystore/sdk";

const ks = new Keystore({ agentToken: "ks_abc123..." });
const openai = wrapOpenAI(new OpenAI(), ks);
```

Option B: Intercept All Fetch Requests

interceptAll patches globalThis.fetch to transparently reroute requests to supported provider domains through the proxy. No changes to client initialization are required.

```typescript
import OpenAI from "openai";
import { Keystore } from "@keystore/sdk";

const ks = new Keystore({ agentToken: "ks_abc123..." });
ks.interceptAll(["openai"]);

// The OpenAI client works normally -- requests are rerouted automatically.
const openai = new OpenAI({ apiKey: "unused" });

const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is the capital of France?" }],
});

console.log(completion.choices[0].message.content);

// Restore the original fetch when done.
ks.restore();
```

Omitting the provider list intercepts all supported providers:

```typescript
ks.interceptAll(); // intercepts openai, anthropic, neon, resend, vercel
```
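Under the hood, interception amounts to URL rewriting before the request leaves the process. The sketch below illustrates the idea; the proxy base URL, path scheme, and host map are illustrative placeholders (the real SDK also attaches the agent token and handles more providers):

```typescript
// Sketch of fetch interception: rewrite requests aimed at known provider
// hosts so they go through a proxy. Hosts and proxy URL are placeholders,
// not Keystore's real endpoints.
const PROVIDER_HOSTS: Record<string, string> = {
  "api.openai.com": "openai",
  "api.anthropic.com": "anthropic",
};

function rewriteUrl(url: string, proxyBase: string): string {
  const u = new URL(url);
  const provider = PROVIDER_HOSTS[u.hostname];
  if (!provider) return url; // not a supported provider; leave untouched
  return `${proxyBase}/${provider}${u.pathname}${u.search}`;
}

function interceptAll(proxyBase: string): () => void {
  const originalFetch = globalThis.fetch;
  globalThis.fetch = ((input: any, init?: any) =>
    originalFetch(rewriteUrl(String(input), proxyBase), init)) as typeof fetch;
  // Return a restore function, mirroring ks.restore().
  return () => {
    globalThis.fetch = originalFetch;
  };
}
```

Because the patch is global, restoring the original fetch when you are done (as `ks.restore()` does) keeps unrelated requests from passing through the rewrite.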

Option C: Set Environment Variables

Use setupEnv to write OPENAI_BASE_URL and OPENAI_API_KEY into process.env. This is useful when the OpenAI client is initialized elsewhere in your codebase or by a framework.

```typescript
import { Keystore } from "@keystore/sdk";

const ks = new Keystore({ agentToken: "ks_abc123..." });
ks.setupEnv(["openai"]);

// Later, anywhere in your app:
import OpenAI from "openai";
const openai = new OpenAI(); // reads OPENAI_BASE_URL and OPENAI_API_KEY from env
```
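Conceptually, setupEnv just sets the two variables before any client is constructed. A minimal sketch, with a placeholder proxy URL and path (the real values come from the SDK):

```typescript
// What setupEnv for OpenAI amounts to conceptually. The base URL is a
// placeholder; the real proxy address is supplied by the SDK.
function setupOpenAIEnv(agentToken: string, proxyBase: string): void {
  process.env.OPENAI_BASE_URL = `${proxyBase}/openai/v1`;
  process.env.OPENAI_API_KEY = agentToken;
}

setupOpenAIEnv("ks_abc123", "https://proxy.keystore.example");
```

The OpenAI client reads these variables when it is constructed, so call setupEnv before any `new OpenAI()` runs.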

Streaming

All approaches work with streaming. The proxy forwards the stream from OpenAI to your agent.

```typescript
import OpenAI from "openai";
import { Keystore } from "@keystore/sdk";

const ks = new Keystore({ agentToken: "ks_abc123..." });
const openai = ks.wrap(new OpenAI());

const stream = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Write a haiku about APIs." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```
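If you also need the complete text once the stream ends, accumulate the deltas. This sketch assumes only the standard chat-completion chunk shape and uses a mock stream in place of a live response:

```typescript
// Accumulate streamed deltas into the full response text. Chunk mirrors
// the shape of OpenAI chat-completion stream chunks.
type Chunk = { choices: Array<{ delta?: { content?: string } }> };

async function collectText(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}

// Mock stream standing in for a real API response.
async function* mockStream(): AsyncIterable<Chunk> {
  yield { choices: [{ delta: { content: "Hello, " } }] };
  yield { choices: [{ delta: { content: "world" } }] };
  yield { choices: [{ delta: {} }] }; // final chunk often has no content
}
```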

Function Calling

Function calling works exactly as it does with the standard OpenAI SDK; Keystore is transparent at the protocol level.

```typescript
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is the weather in Tokyo?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get current weather for a location",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string" },
          },
          required: ["location"],
        },
      },
    },
  ],
});
```
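To act on the model's response, read the `tool_calls` array from the returned message, run the matching local function, and send each result back as a `role: "tool"` message. A minimal dispatch sketch; the `get_weather` handler here is a hypothetical local implementation:

```typescript
// Minimal tool-call dispatch: map tool names to local handlers and parse
// the JSON-encoded arguments the model supplies.
type ToolCall = { id: string; function: { name: string; arguments: string } };

const handlers: Record<string, (args: any) => string> = {
  // Hypothetical local implementation; a real one would call a weather API.
  get_weather: (args) => `Weather in ${args.location}: 18C, clear`,
};

function runToolCalls(toolCalls: ToolCall[]) {
  // Each result becomes a { role: "tool", tool_call_id, content } message
  // in the follow-up request to the model.
  return toolCalls.map((call) => ({
    role: "tool" as const,
    tool_call_id: call.id,
    content: handlers[call.function.name](JSON.parse(call.function.arguments)),
  }));
}
```

Append these tool messages (along with the assistant message that contained the tool calls) to the conversation and call `chat.completions.create` again to get the model's final answer.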

Monitoring

After making requests, use the CLI to check usage and logs:

```bash
keystore usage ag_abc123
keystore logs ag_abc123 --provider openai
```

Each proxied request is logged with the provider, path, status code, latency, and cost.