63 changes: 62 additions & 1 deletion packages/spec/scripts/build-schemas.ts
@@ -74,10 +74,71 @@ function writeFileWithRetry(filePath: string, content: string, retries = MAX_RET
}
}

/**
* Recursively remove directory with retry logic
* More robust than fs.rmSync for handling ENOTEMPTY errors in CI
*
* @param dirPath - The path to the directory to remove
* @param retries - Maximum number of retry attempts (default: MAX_RETRIES)
* @throws {Error} When directory cannot be removed after all retries are exhausted
*/
function removeDirRecursive(dirPath: string, retries = MAX_RETRIES): void {
for (let attempt = 0; attempt < retries; attempt++) {
try {
if (!fs.existsSync(dirPath)) {
return; // Already removed
}

// Read all entries in the directory
const entries = fs.readdirSync(dirPath, { withFileTypes: true });

// Process each entry
for (const entry of entries) {
const fullPath = path.join(dirPath, entry.name);

if (entry.isDirectory()) {
// Recursively remove subdirectory
try {
removeDirRecursive(fullPath, retries);
Copilot AI Feb 5, 2026

The recursive call passes the same retries parameter to subdirectories, which means the outer retry loop (line 86) can retry the entire directory removal multiple times, and each subdirectory removal will also retry multiple times. This creates a multiplicative retry effect (e.g., with MAX_RETRIES=3, a directory nested three levels deep could see up to 3×3×3 = 27 attempts). This can significantly increase build times in CI.

Consider passing retries: 1 to recursive calls or removing the outer retry loop, since the inner operations already have retry logic.

Suggested change
removeDirRecursive(fullPath, retries);
removeDirRecursive(fullPath, 1);
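To make the multiplicative effect concrete, the worst-case attempt count grows as the retry budget raised to the nesting depth. A tiny sketch (the function name is illustrative, not from the PR):

```typescript
// Illustrative only: worst-case removal attempts for a directory nested
// `depth` levels deep when every level is allowed `retries` attempts.
function worstCaseAttempts(retries: number, depth: number): number {
  return retries ** depth;
}

console.log(worstCaseAttempts(3, 3)); // MAX_RETRIES = 3, three nesting levels → 27
```

Passing `retries: 1` to the recursive calls collapses the depth term, so the budget stays linear in the retry count.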

} catch (error) {
throw new Error(`Failed to remove subdirectory ${fullPath}: ${error}`);
Copilot AI Feb 5, 2026

Wrapping the subdirectory removal error in a new Error object loses the original error's stack trace and type information. This makes debugging CI failures more difficult, which contradicts the PR's goal of improving reliability.

Instead of wrapping, re-throw the original error or use error cause chaining (available in Node.js 16.9.0+): throw new Error(`Failed to remove subdirectory ${fullPath}`, { cause: error }).

Suggested change
throw new Error(`Failed to remove subdirectory ${fullPath}: ${error}`);
throw new Error(`Failed to remove subdirectory ${fullPath}`, { cause: error });

Comment on lines +100 to +104
Copilot AI Feb 5, 2026

The nested try-catch for subdirectory removal (lines 101-105) will catch errors and re-throw them, bypassing the file-level retry logic. This means if a subdirectory removal fails on the first attempt, it won't be retried by the outer loop at line 86 because the error is thrown immediately, preventing the loop from continuing to process remaining entries.

This inconsistency means subdirectories get different error handling than the main directory, potentially making the function less reliable for the nested structures it's meant to handle.

Suggested change
// Recursively remove subdirectory
try {
removeDirRecursive(fullPath, retries);
} catch (error) {
throw new Error(`Failed to remove subdirectory ${fullPath}: ${error}`);
// Recursively remove subdirectory with retry
for (let dirRetry = 0; dirRetry < retries; dirRetry++) {
try {
removeDirRecursive(fullPath, retries);
break;
} catch (error) {
if (dirRetry === retries - 1) {
throw error;
}
sleepSync(RETRY_DELAY_BASE_MS * (dirRetry + 1));
}

}
} else {
// Remove file with retry
for (let fileRetry = 0; fileRetry < retries; fileRetry++) {
try {
fs.unlinkSync(fullPath);
break;
} catch (error) {
if (fileRetry === retries - 1) {
throw error;
}
sleepSync(RETRY_DELAY_BASE_MS * (fileRetry + 1));
}
}
}
Comment on lines +99 to +119
Copilot AI Feb 5, 2026

The function doesn't handle symbolic links, which could cause issues if the json-schema directory contains symlinks. Attempting to recurse into a symlink to a directory or remove a symlink as a file could fail. The original fs.rmSync handles symlinks correctly.

Add a check for symlinks using entry.isSymbolicLink() and handle them appropriately (e.g., using fs.unlinkSync() regardless of the target type).
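A sketch of how that check could look — the helper name and structure are illustrative, not the PR's code. The key point is testing `isSymbolicLink()` before `isDirectory()`, and unlinking the link itself rather than following it:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Illustrative helper: remove one directory entry without ever following
// a symlink into its target. `entry` comes from
// readdirSync(dirPath, { withFileTypes: true }).
function removeEntry(dirPath: string, entry: fs.Dirent): void {
  const fullPath = path.join(dirPath, entry.name);
  if (entry.isSymbolicLink()) {
    fs.unlinkSync(fullPath); // removes the link itself, whatever it points at
  } else if (entry.isDirectory()) {
    fs.rmSync(fullPath, { recursive: true, force: true });
  } else {
    fs.unlinkSync(fullPath);
  }
}
```

Checking the symlink case first matters because a `Dirent` for a symlink to a directory reports `isDirectory()` as false, but a naive `lstat`-free recursion elsewhere could still descend through it.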

}

// Now remove the empty directory
// Using rmdirSync for removing empty directories (works reliably across Node versions)
fs.rmdirSync(dirPath);
return; // Success

Comment on lines +122 to +126
Copilot AI Feb 5, 2026

fs.rmdirSync() has been deprecated since Node.js 14.14.0 in favor of fs.rmSync() with the recursive option (which is being replaced in this PR). While it still works in Node 18+ (per the package.json engines requirement), using deprecated APIs is a best practice violation.

Consider using fs.rmSync(dirPath) instead of fs.rmdirSync(dirPath) for removing the empty directory, as it's the modern, non-deprecated alternative.

Suggested change
// Now remove the empty directory
// Using rmdirSync for removing empty directories (works reliably across Node versions)
fs.rmdirSync(dirPath);
return; // Success
// Now remove the empty directory using modern API (non-deprecated)
fs.rmSync(dirPath, { recursive: false, force: false });
return; // Success

} catch (error) {
if (attempt === retries - 1) {
throw new Error(`Failed to remove directory ${dirPath} after ${retries} attempts: ${error}`);
}
// Wait before retrying, with a linearly increasing delay
const delay = RETRY_DELAY_BASE_MS * (attempt + 1);
sleepSync(delay);
}
}
Comment on lines +86 to +135
Copilot AI Feb 5, 2026

The outer try-catch block's retry logic (lines 86-135) may not work as intended because once the directory traversal begins (line 93), any error thrown from within the loop (lines 96-120) will be caught by the outer catch block. However, if the error occurred partway through processing entries, retrying from the beginning could encounter already-deleted files or directories, potentially causing new errors.

Consider whether the outer retry loop adds value given that individual file operations already have retry logic. If ENOTEMPTY errors occur at the rmdirSync call (line 124), the outer retry would help, but the current error handling doesn't distinguish between different failure points.

}

// Clean output directory ensures no stale files remain
if (fs.existsSync(OUT_DIR)) {
console.log(`Cleaning output directory: ${OUT_DIR}`);
-  fs.rmSync(OUT_DIR, { recursive: true, force: true, maxRetries: MAX_RETRIES, retryDelay: RETRY_DELAY_BASE_MS });
+  removeDirRecursive(OUT_DIR);

// Wait a bit to ensure file system has synced
// This prevents ENOENT errors on some file systems, particularly in CI environments
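The retry loops above call a `sleepSync` helper whose definition falls outside this hunk. One plausible implementation — an assumption, not the PR's actual code — blocks the thread with `Atomics.wait` instead of busy-spinning:

```typescript
// Assumed implementation of the sleepSync helper used by the retry loops
// (the real definition is outside the diff hunk). Atomics.wait blocks the
// current thread until the timeout elapses, without burning CPU.
function sleepSync(ms: number): void {
  const signal = new Int32Array(new SharedArrayBuffer(4));
  Atomics.wait(signal, 0, 0, ms); // times out after ~ms milliseconds
}
```

Node.js permits `Atomics.wait` on the main thread, so this is fine in a build script, though the same call would throw on a browser's main thread.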