The database API for agents that speak English

Your agents talk.
Your database listens.

Your agent sends plain English to one API endpoint. Polykomos translates it to SQL, executes it, and returns structured results. No SQL knowledge needed. Provision isolated PostgreSQL, MySQL, or SQLite instances, then query them in natural language — all through the same API.

natural language → SQL · 3 engines, 1 API · provision + query + teardown
Start building. Free tier available. No credit card.
$ polykomos query
$ curl -X POST https://polykomos.com/api/v1/query_executions \
  -H "Authorization: Bearer $POLYKOMOS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"database_uuid": "d7f3a1b2-...",
       "query": "create a contacts table with name, email, and phone"}'
 
{"success": true,
"sql": "CREATE TABLE \"contacts\" (...)",
"result": {"type": "ddl_batch",
"statementsExecuted": 2}}

Agents don't need SQL. They just need to describe what they want.

Describe what you want, and Polykomos handles everything else. It generates safe SQL for PostgreSQL, MySQL, or SQLite, validates it, and returns structured results — all from a single API call.

agent:sales-assistant → polykomos
# agent creates schema
"create a contacts table with name, email, phone, and company"
 
{"success": true,
"sql": "CREATE TABLE \"contacts\" (\"id\" SERIAL PRIMARY KEY, ...)",
"result": {"type": "ddl_batch", "statementsExecuted": 2}}
 
# agent populates data
"add 3 sample contacts at Acme Corp, Globex, and Initech"
 
{"success": true,
"sql": "INSERT INTO \"contacts\" ...",
"result": {"type": "write", "affectedRows": 3}}
 
# agent queries
"show all contacts from Acme Corp"
 
{"success": true,
"sql": "SELECT * FROM \"contacts\" WHERE \"company\" = 'Acme Corp'",
"result": {"type": "read",
"rows": [{"name": "Alice Chen", "email": "alice@acme.com",
"phone": "555-0142", "company": "Acme Corp"}]}}
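Every response above follows the same envelope: a `success` flag, the generated `sql`, and a typed `result`. A minimal sketch in Python of how an agent loop might dispatch on the three result types shown in the examples (`ddl_batch`, `write`, `read`); the field names come from the sample payloads, while `handle_result` itself is a hypothetical helper:

```python
# Dispatch on the result types shown in the example responses:
# "ddl_batch", "write", and "read". Field names follow the sample
# payloads; handle_result is a hypothetical helper, not part of any SDK.

def handle_result(response: dict) -> str:
    if not response.get("success"):
        raise RuntimeError(f"query failed: {response}")
    result = response["result"]
    kind = result["type"]
    if kind == "ddl_batch":
        return f"schema updated ({result['statementsExecuted']} statements)"
    if kind == "write":
        return f"wrote {result['affectedRows']} row(s)"
    if kind == "read":
        return f"fetched {len(result['rows'])} row(s)"
    raise ValueError(f"unknown result type: {kind}")
```

Feeding it the insert response from the example above would report three rows written; the read response, one row fetched.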

A managed database platform — with an agent layer on top.

Polykomos is a serverless database host. Provision PostgreSQL, MySQL, or SQLite instances and connect with any standard tool — psql, any ORM, any database driver. Use it for apps, CI/CD, websites, or internal tooling exactly the way you'd use any cloud database.

The natural language endpoint is an additional capability. Your agents can use it to skip SQL entirely, but traditional SQL access is always there. Same credentials, same instance, both approaches work side by side.

$ psql
$ psql "postgresql://u_0f3a:kx9m2...@db-0f3a.polykomos.com:5432/tenant_0f3a"
 
psql (16.2)
SSL connection (protocol: TLSv1.3)
 
tenant_0f3a=> SELECT name, email, company FROM contacts LIMIT 3;
 
    name     |      email       |  company
-------------+------------------+-----------
 Alice Chen  | alice@acme.com   | Acme Corp
 Bob Park    | bob@globex.com   | Globex
 Carol Ruiz  | carol@initech.io | Initech
(3 rows)

Everything your agents need in one API.

The full lifecycle in a single integration. Provision a database, query it in natural language, and tear it down when the task is done.

Natural language queries. Agents describe what they want in plain English. Polykomos generates safe SQL, validates it, and returns structured results.
Per-agent isolation. Each workload owns its own database, credentials, and lifecycle. Provision Postgres, MySQL, or SQLite with one API call. Tear it down without touching the rest.
Autonomy with accountability. Per-database quotas on storage and connections. An agent in a loop hits a ceiling, not your credit card.
agent:research-crawler
# 1. provision a database
$ POST /api/v1/databases
{"name":"crawl-8b3f", "db_type":"postgres"}
→ {"uuid":"d7f3a1b2-...", "status":"active"}
 
# 2. query in natural language
$ POST /api/v1/query_executions
{"database_uuid":"d7f3a1b2-...",
 "query":"create a results table with url, title, summary"}
→ {"success": true, "statementsExecuted": 2}
 
$ POST /api/v1/query_executions
{"database_uuid":"d7f3a1b2-...",
 "query":"insert a result for https://example.com"}
→ {"success": true, "affectedRows": 1}
 
# 3. teardown
$ DELETE /api/v1/databases/d7f3a1b2-...
→ {"status":"deleted"}

Built for automation

AI agents: SQL optional

Agents can send natural language to one endpoint and let Polykomos generate the SQL, or connect directly and run queries themselves. Provision, populate, query, and tear down — with or without writing SQL.

Deploy pipelines

A database per tenant, per environment, or per branch. Created in CI, torn down when the branch closes.

Internal platforms

One provisioning API behind your self-serve portal. Developers get databases without tickets.

Multi-tenant SaaS

Isolate each customer in their own instance. Onboarding provisions it, billing meters it.

Three engines. One API.

Swap engines by changing a single field.

Provisioning, auth, quotas, and automation stay the same.

PostgreSQL db_type: "postgres"
MySQL db_type: "mysql"
SQLite db_type: "sqlite"

Database engines

PostgreSQL

available now

Transactional workloads, relational integrity, advanced SQL. The default for anything production-grade.

Best for core product databases

MySQL

available now

MySQL-compatible stacks, existing workload migrations, and systems that expect MySQL wire protocol.

Best for legacy integration + compatibility

SQLite

rollout in progress

Preview environments, tooling, agent scratch state. For when a full server instance would be overkill.

Best for single-tenant + utility workloads

How it works

01

Provision or query from code

One POST to /v1/databases provisions Postgres, MySQL, or SQLite. Or skip straight to /v1/query_executions and send natural language — Polykomos generates the SQL. Your deploy script, platform controller, or agent makes the call — no console, no ticket.

02

Receive isolated credentials

The response includes a unique connection string, per-instance credentials, and a database ID for lifecycle management. Credentials are encrypted at rest and scoped to that single instance.

03

Metering and limits from the start

Every instance ships with configurable quotas: storage caps and connection limits you can set in the dashboard or API. Usage is metered hourly and alerts fire at your thresholds — before anything overruns.

Or just use the dashboard.

Not everything needs to be automated. The web console gives you the same power without writing a line of code.

One-click provisioning

Pick an engine, set your quotas, click Create. Your database is live in seconds with credentials ready to copy.

Live usage monitoring

Real-time charts for storage, runtime hours, and connection counts. See what's running, what's idle, and what's approaching limits.

Lifecycle management

Pause instances to stop compute charges, adjust quotas without downtime, or tear down databases when you're done. All from the UI.

my-databases active
engine postgresql
storage 2.4 GB / 10 GB
compute 18 hrs / 200 hrs
connections 7 / 100
// api surface

More than provisioning and queries.

Two more capabilities round out the API: schema blueprints that stamp a consistent structure onto any database in one call, and webhooks that notify your systems when events happen. Neither requires the dashboard.

Blueprints

Ship every database with the schema already in place.

A blueprint is a saved SQL schema — tables, indexes, extensions — stored in your account. Apply it to any database with one API call. Define it once, execute it everywhere. Useful for multi-tenant onboarding, CI test fixtures, or any workflow where new databases need to start with a consistent structure. You can manage blueprints in the dashboard or via the API, and apply them at provision time or any time after.

POST /api/v1/databases/:uuid/blueprint_applications
$ POST /api/v1/databases/d7f3a1b2-.../blueprint_applications
{"blueprint_uuid": "bp_8c3f..."}
 
{"success": true,
"statements_executed": 7,
"tables_created": ["users", "sessions", "events"]}
Webhooks

Get notified when things happen.

Register a URL and the events you care about. Polykomos POSTs a signed JSON payload to your endpoint whenever those events fire: a database created or deleted, a blueprint applied. Each delivery includes an HMAC-SHA256 signature in the X-Polykomos-Signature header so you can verify it came from us. Delivery history and test pings are available via the API and dashboard, so you can debug integrations without waiting for real events.

POST /api/v1/webhook_endpoints
$ POST /api/v1/webhook_endpoints
{"url": "https://your-app.com/hooks",
"events": ["database.created", "database.deleted"]}
 
{"secret": "whsec_a3f...", "status": "active"}
 
# delivery to your endpoint:
X-Polykomos-Signature: sha256=9b1f...
{"event": "database.created", "timestamp": "...",
"data": {"uuid": "d7f3a1b2-...", "db_type": "postgres"}}
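Before trusting a delivery, verify the signature. A minimal sketch in Python, using only the standard library — the `sha256=` prefix and `whsec_` secret follow the example payload above, but the exact bytes being signed (here assumed to be the raw request body) should be confirmed against the webhook docs:

```python
import hashlib
import hmac

# Verify the X-Polykomos-Signature header on an incoming webhook.
# Assumption: the signature is HMAC-SHA256 over the raw request body,
# hex-encoded with a "sha256=" prefix, keyed by your whsec_ secret.

def verify_signature(secret: str, body: bytes, header: str) -> bool:
    expected = "sha256=" + hmac.new(
        secret.encode(), body, hashlib.sha256
    ).hexdigest()
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(expected, header)
```

In a request handler, call this on the raw body bytes before parsing the JSON; parsing first and re-serializing can change the bytes and break verification.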

// platform profile

Compute
shared_cpu 0.5 vCPU
dedicated_cpu 2 vCPU
high_perf 4 vCPU
Storage
type NVMe SSD
encryption AES-256 at rest
iops 3000 baseline
Security
tls enforced
credentials per-instance, Sodium encrypted
isolation dedicated instance per database
Backups
frequency daily
retention 7 days
pitr pro tier
Regions
us-east-1 Virginia
eu-west-1 Ireland
ap-southeast-1 Singapore
Engines
postgresql available
mysql available
sqlite rollout

// simple pricing

                           Free                  Enterprise
                           $0, no credit card    Custom, contact sales
databases                  1                     unlimited
runtime                    100 hrs / month       custom
storage                    512 MB                custom
max storage per database   512 MB                custom
connections per database   10                    custom
NLQ queries                10 / day              unlimited
Runtime is charged per database, per hour it's active. Storage is charged per GB averaged over the month. Invoiced monthly. Connections are a simultaneous limit, not a billing metric.

// go deeper

Not sure where to start? These guides walk through real use cases, code examples, and setup steps for both approaches.

Provision your first database in under a minute.

Free tier. No credit card. Pick an engine and start writing data.