# Python SDK
The `envclaw` Python package lets your AI agents access provider APIs through the Keystore proxy. Agents receive a `ks_` token instead of real credentials; the vault resolves the real keys at request time.
## Installation

```bash
pip install envclaw
```

## Quick Start
```python
from envclaw import Keystore

ks = Keystore(agent_token="ks_abc123...")
```

The `Keystore` instance is the entry point for all SDK functionality. Pass your agent token (it starts with `ks_`) to the constructor.
## Wrapping Provider Clients

### OpenAI
Use `wrap_openai` to route OpenAI requests through the Keystore proxy. The agent token replaces the API key, and the base URL is rewritten to point at the proxy.
```python
from openai import OpenAI
from envclaw import Keystore, wrap_openai

ks = Keystore(agent_token="ks_abc123...")
client = wrap_openai(OpenAI(), ks)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

### Anthropic
Use `wrap_anthropic` for Anthropic's Claude SDK.
```python
from anthropic import Anthropic
from envclaw import Keystore, wrap_anthropic

ks = Keystore(agent_token="ks_abc123...")
client = wrap_anthropic(Anthropic(), ks)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
print(message.content[0].text)
```

## Intercepting All Requests
`intercept_all` patches Python's HTTP layer to route all matching provider requests through the Keystore proxy. This works with any SDK or HTTP client that hits a supported provider domain.
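Conceptually, the interception is a host rewrite: requests whose destination matches a supported provider domain are redirected to the proxy, and everything else passes through untouched. A minimal sketch of that idea (the domain list and proxy host below are placeholders for illustration, not envclaw's actual configuration):

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder values for illustration -- not envclaw's real configuration.
PROVIDER_HOSTS = {"api.openai.com", "api.anthropic.com"}
PROXY_NETLOC = "proxy.example.com"

def rewrite_url(url: str) -> str:
    """Redirect matching provider requests to the proxy; pass others through."""
    parts = urlsplit(url)
    if parts.hostname in PROVIDER_HOSTS:
        parts = parts._replace(netloc=PROXY_NETLOC)
    return urlunsplit(parts)

print(rewrite_url("https://api.openai.com/v1/chat/completions"))
# https://proxy.example.com/v1/chat/completions
print(rewrite_url("https://example.org/health"))
# https://example.org/health (unchanged: not a provider host)
```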
```python
from envclaw import Keystore, intercept_all

ks = Keystore(agent_token="ks_abc123...")
intercept_all(ks)

# All requests to OpenAI, Anthropic, etc. now go through the proxy.
from openai import OpenAI

client = OpenAI(api_key="unused")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```

You can limit interception to specific providers:

```python
intercept_all(ks, providers=["openai", "anthropic"])
```

## Setting Up Environment Variables
`setup_env` writes provider-specific environment variables so that frameworks and SDKs that read from `os.environ` work automatically.
```python
from envclaw import Keystore, setup_env

ks = Keystore(agent_token="ks_abc123...")
setup_env(ks, providers=["openai", "anthropic"])

# Environment variables are now set:
#   OPENAI_BASE_URL    -> proxy URL
#   OPENAI_API_KEY     -> ks_abc123...
#   ANTHROPIC_BASE_URL -> proxy URL
#   ANTHROPIC_API_KEY  -> ks_abc123...
```

This is particularly useful for frameworks like LangChain or CrewAI that initialize clients from environment variables.
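In effect, `setup_env` just populates `os.environ` so that env-driven clients need no code changes. The sketch below imitates that behavior with placeholder values (the real proxy URL and the exact variable set come from the vault, not from this code):

```python
import os

# Hypothetical stand-in for envclaw.setup_env, using placeholder values.
def setup_env_sketch(agent_token: str, proxy_url: str, providers: list[str]) -> None:
    per_provider = {
        "openai": {"OPENAI_BASE_URL": proxy_url, "OPENAI_API_KEY": agent_token},
        "anthropic": {"ANTHROPIC_BASE_URL": proxy_url, "ANTHROPIC_API_KEY": agent_token},
    }
    for provider in providers:
        os.environ.update(per_provider[provider])

setup_env_sketch("ks_abc123...", "https://proxy.example.com", ["openai"])

# A framework that builds its client from os.environ now talks to the proxy.
print(os.environ["OPENAI_BASE_URL"])
# https://proxy.example.com
```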
## Supported Providers
| Provider | Wrapper Function | Env Vars Set by `setup_env` |
|---|---|---|
| OpenAI | `wrap_openai` | `OPENAI_BASE_URL`, `OPENAI_API_KEY` |
| Anthropic | `wrap_anthropic` | `ANTHROPIC_BASE_URL`, `ANTHROPIC_API_KEY` |
| Neon | -- | `DATABASE_URL` |
| Resend | -- | `RESEND_BASE_URL`, `RESEND_API_KEY` |
| Vercel | -- | `VERCEL_API_URL`, `VERCEL_TOKEN` |
| S3 | -- | `AWS_ENDPOINT_URL_S3` |
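Providers without a wrapper function are reached purely through environment variables: any client that reads the variable listed in the table works unmodified. A hedged sketch for Neon, simulating the value that `setup_env(ks, providers=["neon"])` would leave behind (the actual connection string is resolved by the vault, and this placeholder is not a real DSN):

```python
import os

# Placeholder: simulate what setup_env would write for the "neon" provider.
os.environ["DATABASE_URL"] = "postgresql://agent:ks_abc123@proxy.example.com/appdb"

# Any Postgres client that honors DATABASE_URL now routes through the proxy,
# e.g. psycopg.connect(os.environ["DATABASE_URL"]).
dsn = os.environ["DATABASE_URL"]
print(dsn)
```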
## Error Handling
If an invalid or expired agent token is provided, the proxy returns a 401 error. Handle this in your application:
```python
from openai import OpenAI, AuthenticationError
from envclaw import Keystore, wrap_openai

ks = Keystore(agent_token="ks_abc123...")
client = wrap_openai(OpenAI(), ks)

try:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello"}],
    )
except AuthenticationError:
    print("Agent token is invalid or expired")
```