Replicate rows from an external SQL source into the internal Durable Object SQLite database.
- `GET /data-sync`: List configured sync tasks.
- `POST /data-sync`: Create or update a task.
- `DELETE /data-sync/:name`: Delete a task.
- `POST /data-sync/run/:name`: Run a task immediately.
curl --location 'https://starbasedb.YOUR-ID.workers.dev/data-sync' \
--header 'Authorization: Bearer ADMIN_TOKEN' \
--header 'Content-Type: application/json' \
--data '{
"name": "users_sync",
"sourceTable": "users",
"sourceSchema": "public",
"targetTable": "users_cache",
"cursorColumn": "id",
"intervalCron": "*/5 * * * *",
"batchSize": 250
}'

- `name`: Unique task identifier (letters, numbers, underscores).
- `sourceTable`: External table to pull from.
- `sourceSchema`: Optional external schema for PostgreSQL/MySQL.
- `targetTable`: Internal SQLite table to replicate into.
- `cursorColumn`: Incremental checkpoint column (must be monotonic).
- `intervalCron`: Cron expression used by the cron plugin.
- `batchSize`: Optional pull size (defaults to 250, min 10, max 2000).
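The documented constraints on `name` and `batchSize` can be checked client-side before submitting a task. This is an illustrative sketch only, not part of StarbaseDB's API; the function name `normalize_task` is hypothetical, and the server performs its own validation regardless.

```python
import re

def normalize_task(task: dict) -> dict:
    # Hypothetical client-side check mirroring the documented constraints.
    # name: letters, numbers, underscores only.
    if not re.fullmatch(r"[A-Za-z0-9_]+", task["name"]):
        raise ValueError("name must contain only letters, numbers, underscores")
    # batchSize: optional, defaults to 250, clamped to [10, 2000].
    size = task.get("batchSize", 250)
    task["batchSize"] = max(10, min(2000, size))
    return task

print(normalize_task({"name": "users_sync", "batchSize": 5000})["batchSize"])  # 2000
```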
- The first run pulls the earliest rows ordered by `cursorColumn`.
- Later runs pull rows where `cursorColumn > last_cursor_value`.
- The target table is created automatically when missing.
- Missing target columns are added automatically.
- Upserts are conflict-safe using a unique index on `cursorColumn`.
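The cursor-based pull and conflict-safe upsert described above can be sketched as follows. This is a minimal illustration against two local SQLite connections, not StarbaseDB's actual implementation; the `users`/`users_cache` tables and `id` cursor column mirror the example task, and `sync_batch` is a hypothetical helper.

```python
import sqlite3

def sync_batch(src, dst, batch_size=250):
    # Ensure the target table exists with a unique index on the
    # cursor column (id), so repeated pulls upsert rather than duplicate.
    dst.execute("CREATE TABLE IF NOT EXISTS users_cache (id INTEGER, name TEXT)")
    dst.execute("CREATE UNIQUE INDEX IF NOT EXISTS users_cache_id ON users_cache (id)")

    # Checkpoint: the highest cursor value already replicated.
    last = dst.execute("SELECT COALESCE(MAX(id), -1) FROM users_cache").fetchone()[0]

    # Pull the next batch past the checkpoint, ordered by the cursor column.
    rows = src.execute(
        "SELECT id, name FROM users WHERE id > ? ORDER BY id LIMIT ?",
        (last, batch_size)).fetchall()

    # Conflict-safe upsert keyed on the unique cursor index.
    dst.executemany(
        "INSERT INTO users_cache (id, name) VALUES (?, ?) "
        "ON CONFLICT (id) DO UPDATE SET name = excluded.name", rows)
    dst.commit()
    return len(rows)

# Demo: a source with three rows, synced in batches of two.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO users VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
dst = sqlite3.connect(":memory:")
print(sync_batch(src, dst, batch_size=2))  # 2 (rows 1 and 2)
print(sync_batch(src, dst, batch_size=2))  # 1 (row 3)
```

Each call resumes from the stored checkpoint, so re-running a task never re-copies rows it has already replicated.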
- The external SQL data source must be configured in StarbaseDB.
- The cron plugin must be enabled for scheduled pulls to run.