feat: add allowDataLoss parameter to flow compilation (#545)
# Add `allow_data_loss` Parameter to Flow Compilation
This PR adds a new `allow_data_loss` parameter to the `pgflow.ensure_flow_compiled` function, allowing destructive flow recompilation in production environments when explicitly opted into.
## Key Changes
- Added an optional `allow_data_loss` boolean parameter (default: false) to the `pgflow.ensure_flow_compiled` function
- Modified the recompilation logic to check `v_is_local OR allow_data_loss` before allowing destructive operations
- Updated the edge worker API to support a new `compilation` configuration object that can include `allowDataLoss: true`
- Replaced the deprecated `ensureCompiledOnStartup` boolean with the more flexible `compilation` config
- Added comprehensive tests for the new parameter in both SQL and TypeScript
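The gating rule described above (`v_is_local OR allow_data_loss`) can be sketched as a pure function. This is an illustrative model of the check, not the actual pgflow SQL internals; the function name and parameter names here are hypothetical.

```typescript
// Illustrative sketch of the recompilation gate described in this PR.
// The real check lives inside pgflow.ensure_flow_compiled (SQL); this
// TypeScript function only models the boolean logic.
function canRecompileDestructively(
  isLocal: boolean,        // models v_is_local in the SQL function
  allowDataLoss: boolean,  // models the new allow_data_loss parameter (default: false)
): boolean {
  // Destructive recompilation is permitted in local environments, or in
  // any environment when allow_data_loss is explicitly enabled.
  return isLocal || allowDataLoss;
}
```

With the default `allowDataLoss = false`, production recompilation stays blocked, preserving the pre-existing safe behavior.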
## Benefits
- Provides a safe way to iterate on flow designs in production environments when needed
- Maintains the default safe behavior (no data loss in production) while adding an explicit opt-in for destructive changes
- Improves developer experience by allowing controlled recompilation without environment switching
## Usage
```typescript
// Allow destructive recompilation in any environment
const worker = createFlowWorker(MyFlow, {
  compilation: { allowDataLoss: true },
  // other config...
});

// Skip compilation entirely (pre-compiled flows)
const precompiledWorker = createFlowWorker(MyFlow, {
  compilation: false,
  // other config...
});
```
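From the usage examples above, the shape of the new `compilation` option can be inferred roughly as follows. This is a hypothetical reconstruction; the actual type exported by the edge worker API may differ.

```typescript
// Hypothetical shape of the `compilation` option, inferred from the
// usage examples in this PR; not the actual pgflow type definition.
type CompilationConfig =
  | false                          // skip compilation entirely (pre-compiled flows)
  | { allowDataLoss?: boolean };   // compile on startup, optionally destructive

// Illustrative helper: resolve the effective allow_data_loss value
// that would be passed down to pgflow.ensure_flow_compiled.
function resolveAllowDataLoss(config: CompilationConfig | undefined): boolean {
  // Omitting the option or disabling compilation both keep the safe default.
  if (config === false || config === undefined) return false;
  return config.allowDataLoss === true;
}
```

This mirrors how the deprecated `ensureCompiledOnStartup` boolean maps onto the richer config: `compilation: false` replaces `ensureCompiledOnStartup: false`, while the object form adds the new opt-in.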