# Managed Database
Every TMA.sh project on a Pro or Team plan can have its own managed D1 database — a full SQLite database running on Cloudflare’s edge network. No connection strings, no external services. The database is provisioned automatically the first time a deploy includes migration SQL.
## How it works

TMA.sh uses a developer-owned schema model:
- You define your database schema locally using Drizzle ORM (or write SQL by hand).
- You generate SQL migration files into `db/migrations/`.
- You commit and deploy. When migration SQL is included in build output, TMA.sh provisions a D1 database on first deploy and applies pending migrations.
The database is lazily provisioned — it only gets created when your deploy includes migration SQL statements.
## Setting up your schema

Install Drizzle ORM and drizzle-kit as dev dependencies:

```sh
bun add drizzle-orm
bun add -d drizzle-kit
```

Create a `drizzle.config.ts` at the root of your project:
```ts
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  dialect: 'sqlite',
  schema: './db/schema.ts',
  out: './db/migrations',
});
```

Define your schema in `db/schema.ts`:
```ts
import { sqliteTable, text, integer } from 'drizzle-orm/sqlite-core';

export const users = sqliteTable('users', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  telegramId: integer('telegram_id').notNull().unique(),
  username: text('username'),
  score: integer('score').default(0),
  createdAt: integer('created_at', { mode: 'timestamp' })
    .notNull()
    .$defaultFn(() => new Date()),
});

export const items = sqliteTable('items', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  userId: integer('user_id')
    .notNull()
    .references(() => users.id),
  name: text('name').notNull(),
  rarity: text('rarity', { enum: ['common', 'rare', 'legendary'] }).notNull(),
});
```

## Generating migrations
Run drizzle-kit to generate SQL migration files:
```sh
npx drizzle-kit generate
```

This creates timestamped `.sql` files in `db/migrations/`:

```
db/
  schema.ts
  migrations/
    0000_initial.sql
    meta/
      _journal.json
      0000_snapshot.json
```

Each `.sql` file contains the DDL statements for that migration step. Commit these files to your repository.
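As an illustration, a migration generated from the `users` table above might contain DDL along these lines (the exact file drizzle-kit emits may differ):

```sql
CREATE TABLE `users` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`telegram_id` integer NOT NULL,
	`username` text,
	`score` integer DEFAULT 0,
	`created_at` integer NOT NULL
);
CREATE UNIQUE INDEX `users_telegram_id_unique` ON `users` (`telegram_id`);
```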
## Deploying

Push your code as usual. During deploy processing, TMA.sh applies migrations when migration SQL is present in the build output:
- First deploy with migrations: Provisions a new D1 database for the project and applies all migration statements.
- Subsequent deploys: Applies only new (pending) migrations. Already-applied migrations are skipped.
If the deploy includes no migration SQL, the migration step is skipped.
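Conceptually, the skip logic resembles the sketch below. This is a hypothetical illustration of the rule that already-applied migrations are skipped — `pendingMigrations` is not a real TMA.sh API:

```typescript
// Hypothetical sketch: select which migration files still need to run.
function pendingMigrations(all: string[], applied: string[]): string[] {
  const done = new Set(applied);
  // Drizzle prefixes files with an ordinal (0000_, 0001_, ...), so a
  // lexicographic sort yields the correct application order.
  return all.filter((file) => !done.has(file)).sort();
}

// First deploy: everything is pending. Later deploys: only new files run.
pendingMigrations(['0000_initial.sql', '0001_add_items.sql'], ['0000_initial.sql']);
// → ['0001_add_items.sql']
```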
## Accessing the database in API routes

The database is available as the `DB` binding in your API routes. Use it with Drizzle ORM for typed queries or with the raw D1 API for direct SQL.
### With Drizzle ORM (recommended)
```ts
import { Hono } from 'hono';
import { drizzle } from 'drizzle-orm/d1';
import { eq } from 'drizzle-orm';
import * as schema from '../../db/schema';

type Env = {
  Bindings: {
    DB: D1Database;
  };
};

const app = new Hono<Env>();

app.get('/api/users/:id', async (c) => {
  const db = drizzle(c.env.DB, { schema });
  const userId = Number(c.req.param('id'));

  const user = await db
    .select()
    .from(schema.users)
    .where(eq(schema.users.telegramId, userId))
    .get();

  if (!user) {
    return c.json({ error: 'User not found' }, 404);
  }

  return c.json(user);
});

app.post('/api/users', async (c) => {
  const db = drizzle(c.env.DB, { schema });
  const { telegramId, username } = await c.req.json();

  const user = await db
    .insert(schema.users)
    .values({ telegramId, username })
    .returning()
    .get();

  return c.json(user, 201);
});

export default app;
```

### With the raw D1 API
```ts
app.get('/api/leaderboard', async (c) => {
  const result = await c.env.DB.prepare(
    'SELECT telegram_id, username, score FROM users ORDER BY score DESC LIMIT 50'
  ).all();

  return c.json(result.results);
});

app.get('/api/users/:id/items', async (c) => {
  const userId = c.req.param('id');

  const result = await c.env.DB.prepare('SELECT * FROM items WHERE user_id = ?')
    .bind(userId)
    .all();

  return c.json(result.results);
});
```

Always use parameterized queries (`.bind()`) to prevent SQL injection.
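To see why binding matters, compare a naively interpolated query string (`naiveQuery` is an illustrative helper, not part of the D1 API):

```typescript
// Illustrative only: never build SQL via string interpolation.
function naiveQuery(userId: string): string {
  return `SELECT * FROM items WHERE user_id = ${userId}`;
}

// A malicious path parameter changes the query's meaning:
naiveQuery('1 OR 1=1');
// → "SELECT * FROM items WHERE user_id = 1 OR 1=1" (selects every row)
```

With `.bind()`, the value travels separately from the SQL text, so it can never be parsed as part of the statement.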
## Tier requirements

Managed databases are not available on the Free plan.
| Plan | Database storage |
|---|---|
| Free | Not available |
| Pro | 500 MB |
| Team | 2 GB |
Storage quota is checked before each migration run at deploy time. If applying a migration would exceed your plan’s storage limit, the deploy will fail with a quota error. Upgrade your plan or reduce stored data to continue.
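The quota rule can be summarized with a small sketch (hypothetical: the limits mirror the table above, but the check logic is illustrative, not TMA.sh's actual implementation):

```typescript
// Hypothetical sketch of the deploy-time quota check.
// Plan limits mirror the table above (500 MB Pro, 2 GB Team).
const PLAN_LIMIT_BYTES: Record<'pro' | 'team', number> = {
  pro: 500 * 1024 * 1024,
  team: 2 * 1024 * 1024 * 1024,
};

function wouldExceedQuota(
  plan: 'pro' | 'team',
  currentBytes: number,
  migrationDeltaBytes: number
): boolean {
  // The deploy fails if the projected size crosses the plan limit.
  return currentBytes + migrationDeltaBytes > PLAN_LIMIT_BYTES[plan];
}
```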
## Limitations

- One database per project — each project gets a single D1 database.
- Migrations only at deploy time — you cannot run ad-hoc migrations outside of a deploy.
- No direct access — there is no connection string or external endpoint. The database is only accessible via the `DB` binding in API routes.
- SQLite semantics — D1 is SQLite. Some PostgreSQL or MySQL features (stored procedures, advanced JSON operators, certain window functions) are not available.
- Max query execution time — 30 seconds per query.
- D1 per-database limit — 10 GB hard ceiling regardless of plan.
## Common use cases

- User profiles — store structured user data with relational queries
- Game state — leaderboards, inventory, achievements with proper indexing
- Orders and transactions — track purchases and payment history
- Content management — store app content that needs filtering, sorting, and pagination
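For leaderboard-style reads like the raw D1 example above, an index on the sort column keeps queries fast as the table grows. A hedged SQL sketch using the schema above (the index name is illustrative; add it via a migration):

```sql
-- Speeds up ORDER BY score DESC on the users table.
CREATE INDEX IF NOT EXISTS users_score_idx ON users (score DESC);
```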
## When to use something else

For use cases that require features beyond what D1/SQLite provides, connect to an external database from your API routes:
- Real-time subscriptions — Supabase (PostgreSQL with real-time)
- Global replication — Turso (libSQL with edge replicas)
- Full-text search — a dedicated search service like Meilisearch
See API Routes for examples of connecting to external databases.