feat(poc): replace Docker-based smoke tests with native Node.js harness #2231

rostalan wants to merge 2 commits into redhat-developer:main
Conversation
Boot a minimal Backstage backend directly on the runner using createBackend() + dynamicPluginsFeatureLoader, probe /api/<pluginId> routes, and report results as structured JSON. Includes core bundled plugins (catalog, auth, permission, scaffolder, events, search, proxy) so dynamic plugins resolve their dependencies correctly. Made-with: Cursor
Skipping CI for Draft Pull Request.

/publish
…aded-plugin probe

Add two-layer frontend plugin validation to the smoke test harness:
- Layer 1: static validation of dist-scalprum/ after OCI download
- Layer 2: inline backend probe plugin querying dynamicPluginsServiceRef

Made-with: Cursor
PR needs rebase.
```js
// ---------------------------------------------------------------------------
// OCI Download (mirrors install-dynamic-plugins.py §663-715)
```
The comment itself acknowledges this mirrors install-dynamic-plugins.py. Should we just call the Python script directly as a pre-step instead of reimplementing it in JS?
The overlay repo currently gets the script from the RHDH container image — by removing Docker, we lose access to it. But we could copy the script here (with a version marker) as a short-term solution. That would:
- Eliminate ~60 lines of JS OCI code
- Inherit all security protections for free (zip bomb, symlink traversal, integrity checks)
- Leverage the 130KB+ test suite that already covers this logic
Longer-term options: publish as a pip package or container tool so any repo can consume it.
For context, there's also a parallel POC in RHDH core (redhat-developer/rhdh#4523) that extends the same Python script with parallel downloads (ThreadPoolExecutor) achieving ~34% speedup. Reusing it here would get that performance benefit too.
```js
function extractPlugin(tarFile, pluginPath, dest) {
  mkdirSync(join(dest, pluginPath), { recursive: true });
  execSync(`tar xf "${tarFile}" -C "${dest}" "${pluginPath}/"`, {
```
This extracts tar contents with a bare `execSync('tar xf ...')`, with no validation of archive members before extraction. The Python script in RHDH core (install-dynamic-plugins.py) checks for:
- Zip bomb: `member.size > MAX_ENTRY_SIZE` (20MB default per entry)
- Symlink traversal: `os.path.realpath()` validation against the destination directory
- Hardlink traversal: same check
- Device files/FIFOs: rejected entirely
- Safe tar filter: `tar.extractall(..., filter='tar')`
Even in CI, a crafted OCI layer could exploit path traversal via symlinks. If we keep the JS implementation, all of these checks need to be ported.
```js
  if (existsSync(p)) process.argv.push("--config", p);
}

const { createBackend } = await import("@backstage/backend-defaults");
```
createBackend() is the production API. Backstage provides startTestBackend() from @backstage/backend-test-utils specifically for this use case — it handles lifecycle management, automatic cleanup, built-in in-memory SQLite, and exposes mock services. It would also reduce the manual plugin registration below (~20 lines of backend.add() calls).
redhat-developer/rhdh#4523 uses startTestBackend() for the same plugin loadability validation and it works well.
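For illustration, the swap could look roughly like this. A sketch only: it assumes the Backstage packages are installed in the harness, that the code runs in an async context, and that `dynamicPluginsFeatureLoader` is importable from the dynamic-feature-service package (the exact import path is an assumption):

```js
// Sketch, not the PR's implementation: startTestBackend boots an ephemeral
// backend with mock services and in-memory SQLite, replacing the manual
// createBackend() wiring and most of the backend.add() calls.
const { startTestBackend } = await import("@backstage/backend-test-utils");
const { dynamicPluginsFeatureLoader } = await import(
  "@backstage/backend-dynamic-feature-service"
);

const backend = await startTestBackend({
  features: [dynamicPluginsFeatureLoader()],
});
// Probe /api/<pluginId> routes against the test backend's HTTP server,
// then shut down via backend.stop().
```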
```js
// Config generation
// ---------------------------------------------------------------------------

function deepMerge(src, dst) {
```
This reimplements config deep-merge that already exists in install-dynamic-plugins.py (the merge() function). The Python version also detects key conflicts and raises errors — this one silently overwrites. If we use the Python script as a pre-step, this function becomes unnecessary since the script already generates app-config.dynamic-plugins.yaml with all plugin configs merged.
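If the JS merge is kept, conflict detection could be added along these lines. A sketch: `deepMergeStrict` is a hypothetical replacement, and it approximates the Python behavior described above by throwing on differing leaf values while tolerating identical ones:

```javascript
// Conflict-detecting deep merge: nested objects are merged recursively,
// equal values are tolerated, and a differing value for the same key
// throws instead of silently overwriting.
function deepMergeStrict(src, dst, crumbs = []) {
  for (const [key, value] of Object.entries(src)) {
    const here = [...crumbs, key].join(".");
    const isObj = (v) => v && typeof v === "object" && !Array.isArray(v);
    if (!(key in dst)) {
      dst[key] = value;
    } else if (isObj(value) && isObj(dst[key])) {
      deepMergeStrict(value, dst[key], [...crumbs, key]);
    } else if (JSON.stringify(dst[key]) !== JSON.stringify(value)) {
      throw new Error(
        `Config conflict at '${here}': ` +
          `${JSON.stringify(dst[key])} vs ${JSON.stringify(value)}`
      );
    }
  }
  return dst;
}
```

That said, if the Python script runs as a pre-step, this whole function disappears along with the current one.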
```js
);

backend.add(
  createBackendPlugin({
```
This probe plugin concept is great — using dynamicPluginsServiceRef to verify frontend plugins actually loaded at runtime is more thorough than filesystem-only checks. Worth keeping regardless of the OCI download approach. The sanity check POC in RHDH core (redhat-developer/rhdh#4523) currently only validates bundles via filesystem (dist-scalprum/plugin-manifest.json); this runtime probe approach would complement it nicely.
```js
  });
}

async function downloadPlugins(plugins, dest) {
```
Downloads are sequential here. The install-dynamic-plugins-fast.py in redhat-developer/rhdh#4523 uses ThreadPoolExecutor for parallel OCI downloads with a shared image cache (one download per unique image, not per plugin), achieving ~34% speedup. Another reason to reuse the existing script rather than maintaining a separate implementation.
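If parallelism were added on the JS side instead, the shared-cache idea could be sketched like this. `makeCachedDownloader` and `fetchImage` are hypothetical names, not part of the existing harness; the point is one download per unique image ref plus a concurrency cap:

```javascript
// Deduplicating, concurrency-limited downloader. Each unique image ref is
// fetched exactly once (the cached Promise is shared by all plugins that
// reference the same image), and at most `concurrency` fetches run at once.
function makeCachedDownloader(fetchImage, concurrency = 4) {
  const cache = new Map(); // image ref -> in-flight or resolved Promise
  let active = 0;
  const waiters = [];

  async function withSlot(fn) {
    if (active >= concurrency) {
      await new Promise((resolve) => waiters.push(resolve));
    }
    active += 1;
    try {
      return await fn();
    } finally {
      active -= 1;
      const next = waiters.shift();
      if (next) next();
    }
  }

  return function download(imageRef) {
    if (!cache.has(imageRef)) {
      cache.set(imageRef, withSlot(() => fetchImage(imageRef)));
    }
    return cache.get(imageRef);
  };
}
```

Usage would be `await Promise.all(plugins.map((p) => download(p.image)))`, with per-plugin extraction still happening after the shared image blob is available.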


Boot a minimal Backstage backend directly on the runner using `createBackend()` + `dynamicPluginsFeatureLoader`, probe `/api/<pluginId>` routes, and report results as structured JSON. Includes core bundled plugins (catalog, auth, permission, scaffolder, events, search, proxy) so dynamic plugins resolve their dependencies correctly.

Frontend plugins are also validated via two layers:
- `dist-scalprum/` exists and contains JavaScript files after OCI download
- a probe via `dynamicPluginsServiceRef` confirms frontend plugins were registered without errors

Results are merged into a single report with per-plugin status (`pass`, `fail-bundle`, `fail-load`, `warn`, `skip`) and the process exits non-zero on any failure.

Made-with: Cursor