# client.database

Type-safe Postgres for your agent. You declare a schema in TypeScript, the CLI emits the SQL migration, and `client.database` hands you a Drizzle handle bound to the agent's connection string. No connection-string juggling, no second ORM, no codegen step.
## What you get

- A lazy Drizzle ORM handle on `client.database`, bound to `STACKBONE_POSTGRES_URL`.
- The Drizzle column builders and helpers re-exported through `@stackbone/sdk/db`, so your agent's `package.json` only depends on `@stackbone/sdk`.
- A migration engine driven by the `stackbone db migrate ...` commands — applied with an advisory lock and a journal table, so it is safe to run on every boot.
- A `.stackbone/migrations/` folder that lives in your repo, gets committed alongside your code, and is replayed in every environment in declared order.
## Tutorial — from zero to a queryable table

This walkthrough assumes you have a project scaffolded with `stackbone init`. The hello-world template ships with the wiring already in place; you can follow along on a fresh clone or in your own agent.
### Step 1 — declare the schema

Open `src/schema.ts` and describe your tables with `pgTable` and the column helpers. Always import from `@stackbone/sdk/db`:

```ts
// src/schema.ts
import { integer, pgTable, serial, text, timestamp } from '@stackbone/sdk/db';

export const leads = pgTable('leads', {
  id: serial('id').primaryKey(),
  email: text('email').notNull(),
  status: text('status').notNull().default('new'),
  score: integer('score').notNull().default(0),
  createdAt: timestamp('created_at').defaultNow().notNull(),
});
```

Anything Drizzle exports — column types, helpers like `eq`, `and`, `sql`, `relations`, plus the inference types `InferSelectModel` / `InferInsertModel` — is reachable through `@stackbone/sdk/db`. Never import from `drizzle-orm` directly: the SDK barrel is the only contract the platform supports.
### Step 2 — generate the migration

Once the schema is the way you want it, ask the CLI to compare it against the current state of your migrations folder and emit the next SQL file:

```sh
stackbone db migrate create add_leads
```

That writes a new file under `.stackbone/migrations/` with a numeric prefix (e.g. `0001_add_leads.sql`) plus the matching journal entry under `meta/`. Commit both — they are part of your agent's source code.
### Step 3 — apply the migration

Apply pending migrations to the database your agent points at:

```sh
stackbone db migrate up
```

The command takes an advisory lock so two boots cannot apply the same file twice, records each successful migration in a journal table, and exits cleanly if there is nothing to apply. Running it again is a no-op.

`stackbone dev` runs `migrate up` automatically before booting your agent when `dev.autoMigrate: true` is set in `agent.yaml` (see the agent.yaml configuration section).
### Step 4 — query with Drizzle

Inside your agent code, reach for `client.database` and use the familiar Drizzle surface:

```ts
import { createClient } from '@stackbone/sdk';
import { eq } from '@stackbone/sdk/db';
import { leads } from './schema';

const stackbone = createClient();

// Read.
const qualified = await stackbone.database
  .select()
  .from(leads)
  .where(eq(leads.status, 'qualified'))
  .limit(20);

// Write.
await stackbone.database
  .insert(leads)
  .values({ email: 'jane@example.com', status: 'new', score: 0 });

// Update.
await stackbone.database.update(leads).set({ status: 'contacted' }).where(eq(leads.id, 1));

// Delete.
await stackbone.database.delete(leads).where(eq(leads.id, 1));
```

Row types flow through end to end. The data returned by `select()` is typed from your `pgTable` definition with no codegen step.
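The inference needs no runtime machinery. As a loose illustration of the pattern — a simplified stand-in, not Drizzle's actual `InferSelectModel` — a mapped type can derive the row shape straight from a column spec:

```ts
// Simplified stand-in for Drizzle-style type inference — illustrative only.
type ColumnSpec = { kind: 'text' | 'integer' | 'timestamp' };

type RowOf<T extends Record<string, ColumnSpec>> = {
  [K in keyof T]: T[K]['kind'] extends 'integer' ? number
    : T[K]['kind'] extends 'timestamp' ? Date
    : string;
};

// A schema-like object carries the column kinds at the type level.
const leadsSpec = {
  email: { kind: 'text' },
  score: { kind: 'integer' },
  createdAt: { kind: 'timestamp' },
} as const satisfies Record<string, ColumnSpec>;

// The compiler derives { email: string; score: number; createdAt: Date }.
type LeadRow = RowOf<typeof leadsSpec>;

const row: LeadRow = { email: 'jane@example.com', score: 3, createdAt: new Date() };
console.log(typeof row.score); // "number"
```

The real SDK does the same thing with richer column builders; the point is that the row type is computed by the compiler from the table definition, never generated as a file.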
### Step 5 — wrap a unit of work in a transaction

Interactive transactions yield a `tx` with the same surface as `client.database`:

```ts
import { sql } from '@stackbone/sdk/db';

await stackbone.database.transaction(async (tx) => {
  await tx.insert(leads).values({ email: 'a@example.com', status: 'new' });
  await tx.execute(sql`UPDATE leads SET score = score + 1 WHERE email = ${'a@example.com'}`);
});
```

If the callback throws, the transaction rolls back. If the underlying connection was suspended between calls (cold start on serverless Postgres), the SDK rebuilds the pool and replays the transaction once before surfacing the error.
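The replay-once behaviour can be pictured as a small retry wrapper — a hedged sketch of the described semantics, not the SDK's actual code; `ConnectionSuspendedError` and `withReplayOnce` are made-up names:

```ts
// Illustrative only: retry a transaction callback exactly once when the
// first attempt fails with a connection-level error.
class ConnectionSuspendedError extends Error {}

async function withReplayOnce<T>(run: () => Promise<T>): Promise<T> {
  try {
    return await run();
  } catch (err) {
    if (err instanceof ConnectionSuspendedError) {
      // A real implementation would rebuild the pool here, then replay once.
      return await run();
    }
    throw err; // Application errors surface immediately and roll back.
  }
}

// Demo: the first attempt hits a suspended connection; the replay commits.
let attempts = 0;
const resultPromise = withReplayOnce(async () => {
  attempts += 1;
  if (attempts === 1) throw new ConnectionSuspendedError('pool suspended');
  return 'committed';
});
resultPromise.then((result) => console.log(result, attempts)); // "committed" 2
```

Note the asymmetry: only the connection-level failure is replayed; an error thrown by your own callback propagates on the first attempt.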
## agent.yaml configuration

The `database` block in `agent.yaml` controls where the CLI looks for your schema and migrations. All keys are optional and ship with sane defaults:

```yaml
database:
  schema: ./src/schema.ts             # default
  migrations: ./.stackbone/migrations # default
dev:
  autoMigrate: false # default; set true to migrate on every `stackbone dev` boot
```

- `database.schema` — the TypeScript file the migration engine introspects when generating SQL. Override when your project layout differs (`db/schema/index.ts`, `src/db/schema.ts`, …).
- `database.migrations` — the folder where migration files land. Keep it inside the repo and committed.
- `dev.autoMigrate` — when `true`, `stackbone dev` runs `stackbone db migrate up` before booting the agent and re-runs it whenever the migrations folder changes. Default is `false` so a noisy schema cannot silently mutate your local database.
Both path values are kept verbatim — the CLI resolves them against the project root at command execution time, so they "just work" regardless of which subdirectory you ran `stackbone` from.
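That resolution amounts to joining the configured value onto the discovered project root — a trivial sketch (the root-discovery step itself is assumed to have happened already):

```ts
import { resolve } from 'node:path';

// Illustrative: resolve a config-relative path against the project root,
// independent of the process's current working directory.
function resolveFromRoot(projectRoot: string, configured: string): string {
  return resolve(projectRoot, configured);
}

console.log(resolveFromRoot('/repo', './.stackbone/migrations'));
// → "/repo/.stackbone/migrations"
```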
## CLI commands

Every database operation goes through the `stackbone db` subcommand tree. Each command exits non-zero on failure and supports `--json` for machine-readable output.
### `stackbone db migrate create <name>`

Compares your schema against the current migrations folder and emits the next SQL file under `database.migrations`. Run it after every schema edit. The `<name>` becomes part of the filename (e.g. `0003_add_leads.sql`).
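The numbering is derivable from the folder alone. A hypothetical sketch of how the next prefix could be computed (`nextMigrationFilename` is illustrative, not the CLI's source):

```ts
// Illustrative: derive the next zero-padded numeric prefix from the
// filenames already in the migrations folder.
function nextMigrationFilename(existing: string[], name: string): string {
  const prefixes = existing
    .map((file) => /^(\d+)_/.exec(file))
    .filter((m): m is RegExpExecArray => m !== null)
    .map((m) => parseInt(m[1], 10));
  const next = (prefixes.length > 0 ? Math.max(...prefixes) : 0) + 1;
  return `${String(next).padStart(4, '0')}_${name}.sql`;
}

console.log(nextMigrationFilename(['0001_init.sql', '0002_users.sql'], 'add_leads'));
// → "0003_add_leads.sql"
```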
### `stackbone db migrate up`

Applies pending migrations to the database pointed at by `STACKBONE_POSTGRES_URL`. Uses an advisory lock and a journal table, so concurrent boots cannot double-apply a migration. Idempotent — a second run with no pending files is a no-op.
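The idempotence falls out of the journal diff. A sketch of the idea (assumed shape, not the real engine):

```ts
// Illustrative: only files absent from the journal are applied, in
// filename order, so re-running with nothing pending is a no-op.
function pendingMigrations(folderFiles: string[], applied: Set<string>): string[] {
  return [...folderFiles].sort().filter((file) => !applied.has(file));
}

const folder = ['0002_users.sql', '0001_init.sql'];
console.log(pendingMigrations(folder, new Set(['0001_init.sql']))); // ["0002_users.sql"]
console.log(pendingMigrations(folder, new Set(folder))); // [] — no-op
```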
### `stackbone db migrate status`

Reports which migrations are applied, which are pending, and whether the database has drifted from the journal (e.g. an entry exists in the database but not in the folder). Informational only — never mutates state.
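The drift check is the inverse diff — journal entries with no matching file. A sketch (illustrative names, not the CLI's internals):

```ts
// Illustrative: a journal entry with no matching file in the folder is drift.
function driftedEntries(folderFiles: string[], journalEntries: string[]): string[] {
  const onDisk = new Set(folderFiles);
  return journalEntries.filter((entry) => !onDisk.has(entry));
}

console.log(driftedEntries(['0001_init.sql'], ['0001_init.sql', '0002_users.sql']));
// → ["0002_users.sql"] — applied in the database but missing from the repo
```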
### `stackbone db migrate add-rag`

Emits the canonical `stackbone_rag_v1` schema as a numbered migration under `database.migrations`. Idempotent — re-running is a no-op when the marker `-- @stackbone:rag@v1` is already present in the folder. Run it once before your first `client.rag.*` call, then commit the generated SQL alongside the rest of your migrations. Full surface documented under client.rag → Setup.
### `stackbone db studio`

Opens the local schema explorer against `STACKBONE_POSTGRES_URL`. A read/write GUI for browsing rows, running ad-hoc SQL, and inspecting indexes during development. Stays alive until Ctrl-C.
## pgvector

If your agent needs vector search, declare the column with the `vector` helper from `@stackbone/sdk/db`:

```ts
import { pgTable, text, vector } from '@stackbone/sdk/db';

export const documents = pgTable('documents', {
  id: text('id').primaryKey(),
  content: text('content').notNull(),
  embedding: vector('embedding', { dimensions: 1536 }).notNull(),
});
```

The first migration that introduces a `vector(...)` column emits the matching `CREATE EXTENSION IF NOT EXISTS vector` ahead of the `CREATE TABLE`. No manual extension management needed.

For higher-level retrieval (parsing, chunking, embedding, hybrid search), look at client.rag — it builds on the same pgvector storage but adds the RAG ergonomics on top.
## Connection lifecycle

`client.database` is lazy: the underlying Postgres pool is built the first time you touch it, not at `createClient()` time. This is deliberate, so that env vars rotated by the platform after `createClient()` are still picked up.
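The pattern is a plain lazy getter. A sketch of the idea (illustrative, not the SDK internals; `FAKE_PG_URL` is a stand-in variable):

```ts
// Illustrative: build the pool on first access, so env vars rotated after
// client creation are still honoured, and repeated access reuses the value.
function lazy<T>(build: () => T): () => T {
  let value: T | undefined;
  return () => (value ??= build());
}

let builds = 0;
process.env['FAKE_PG_URL'] = 'postgres://initial';
const getPool = lazy(() => {
  builds += 1;
  return { url: process.env['FAKE_PG_URL'] };
});

// Rotate the env var after "createClient" time, before the first touch.
process.env['FAKE_PG_URL'] = 'postgres://rotated';
console.log(getPool().url, builds); // "postgres://rotated" 1
getPool();
console.log(builds); // still 1 — the pool is built once
```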
If your agent runs on a serverless host that suspends the container between requests (Fly Machines, SSR), a stale pool is detected on the next call and the SDK transparently rebuilds it. From your code's point of view, the query just succeeds.
## Contract gating

The terminal `await` of every Drizzle chain (`select`/`insert`/`update`/`delete`/`execute`/`transaction`) consults the contract handshake before dispatching. The required capability is `database.postgres_direct`; when the negotiated contract does not advertise it (or the version floor is missed), the chain throws a tagged `Error & { code, message, meta }` with `code` set to `capability_unavailable` or `contract_version_unsupported`. The `code` matches the rest of the SDK, so the same `try/catch` works for setup-bug errors like `database_not_configured`.

Two surfaces are intentionally ungated because they are synchronous accessors the user is opting into:

- `client.database.query` — Drizzle's relational query builder.
- `client.database.raw()` — the underlying Drizzle handle.

Use them when you genuinely need the escape hatch.
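In practice that means one narrow catch for capability problems. A hedged sketch of consuming the tagged error shape (the guard helper is illustrative; the `code` values come from the text above):

```ts
// Illustrative: narrow an unknown error to the tagged shape described above.
type TaggedError = Error & { code: string; meta?: unknown };

const GATING_CODES = ['capability_unavailable', 'contract_version_unsupported'];

function isGatingError(err: unknown): err is TaggedError {
  return (
    err instanceof Error &&
    typeof (err as TaggedError).code === 'string' &&
    GATING_CODES.includes((err as TaggedError).code)
  );
}

const gated = Object.assign(new Error('database.postgres_direct not advertised'), {
  code: 'capability_unavailable',
});
console.log(isGatingError(gated)); // true
console.log(isGatingError(new Error('plain failure'))); // false
```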
## Testing — `@stackbone/sdk/db/testing`

For tests that need a real Postgres without colliding with other tests, the SDK ships a helper that provisions an ephemeral schema inside the dev database, applies your migrations against it, and hands back a Drizzle handle plus a `dispose()` for tear-down:

```ts
import { createTestDatabase } from '@stackbone/sdk/db/testing';
import { resolve } from 'node:path';
import { leads } from '../src/schema';

const testDb = await createTestDatabase({
  migrationsDir: resolve(__dirname, '../.stackbone/migrations'),
});

await testDb.drizzle.insert(leads).values({ email: 'a@example.com', status: 'new' });
const rows = await testDb.drizzle.select().from(leads);

await testDb.dispose();
```

Each call generates a unique random schema name, so parallel tests never see each other's rows. `dispose()` drops the schema with `CASCADE`, wiping creator-owned tables and the migration engine's own journal in one statement.
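The uniqueness guarantee only needs a random suffix per call. A sketch of the naming idea (the `test_` prefix is an assumption, not the helper's documented scheme):

```ts
import { randomBytes } from 'node:crypto';

// Illustrative: a fresh random schema name per call keeps parallel tests apart.
function ephemeralSchemaName(prefix = 'test'): string {
  return `${prefix}_${randomBytes(6).toString('hex')}`;
}

const a = ephemeralSchemaName();
const b = ephemeralSchemaName();
console.log(a !== b); // true — two runs never share a schema
```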
By default the helper resolves the connection string in this order: `STACKBONE_TEST_POSTGRES_URL`, then `STACKBONE_POSTGRES_URL`, then `DATABASE_URL`, then the dev compose default. Pass `connectionString` to override.
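The documented order maps onto a chain of nullish fallbacks — a sketch (the final compose-default URL here is a placeholder, not the real value):

```ts
// Illustrative: the first defined env var wins, mirroring the documented order.
function resolveConnectionString(env: Record<string, string | undefined>): string {
  return (
    env['STACKBONE_TEST_POSTGRES_URL'] ??
    env['STACKBONE_POSTGRES_URL'] ??
    env['DATABASE_URL'] ??
    'postgres://localhost:5432/dev' // hypothetical dev-compose default
  );
}

console.log(resolveConnectionString({ DATABASE_URL: 'postgres://db/app' }));
// → "postgres://db/app"
console.log(
  resolveConnectionString({
    STACKBONE_TEST_POSTGRES_URL: 'postgres://test/app',
    DATABASE_URL: 'postgres://db/app',
  }),
); // → "postgres://test/app"
```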
## Escape hatch

Everything `client.database` does is plain Drizzle. The connection string (`STACKBONE_POSTGRES_URL`) is injected into the container, and nothing stops you from instantiating Drizzle (or postgres-js) directly when you need a feature the SDK does not surface yet:

```ts
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';

const sql = postgres(process.env['STACKBONE_POSTGRES_URL']!);
const db = drizzle(sql);
```

The SDK is a thin convenience layer, not a wall. Drop down to upstream when it pays off, and come back when it doesn't.
## Where to go next

- `@stackbone/sdk` overview — the rest of the modules: storage, AI, RAG, approval, secrets, config.
- Persistence tutorial — a guided end-to-end build that exercises `client.database` and `client.storage` together.
- Local development — how `stackbone dev` boots Postgres locally and injects the env vars `client.database` reads.