
Fix v3 config loading with Jiti re-exports #448

Merged
Boshen merged 1 commit into tailwindlabs:main from Dunqing:fix/issue-431-jiti on Feb 5, 2026

Conversation

@Dunqing Dunqing (Contributor) commented Feb 5, 2026

Fixes: #431

Summary
The crash happens when a Tailwind v3 config is an ESM file that re-exports from a workspace package which is itself loaded via Jiti v2. Our plugin's v3 loader used Tailwind's `loadConfig`, which internally uses Jiti v1 and returns a Proxy. When that Proxy touches modules loaded via Jiti v2, the two proxies' getters call each other recursively, causing `RangeError: Maximum call stack size exceeded`.
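The failure mode can be reproduced in isolation with two Proxy wrappers whose `get` traps defer to each other. This is a minimal illustration of the recursion, not Jiti's actual internals:

```typescript
// Minimal sketch (not jiti's real code): two Proxy wrappers whose
// `get` traps forward to each other, as the v1 and v2 module proxies did.
function wrap(target: object, other: () => any): any {
  return new Proxy(target, {
    get(_t, prop) {
      // Each wrapper forwards property access to the other wrapper,
      // so any access recurses until the stack overflows.
      return other()[prop]
    },
  })
}

const a: any = wrap({}, () => b)
const b: any = wrap({}, () => a)

try {
  a.theme // any property access triggers the mutual get traps
} catch (e) {
  console.log((e as Error).message) // "Maximum call stack size exceeded"
}
```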

Where it happens

  • In `src/versions/v3.ts`, the old path was `loadConfig(jsConfig)` (Tailwind v3's loader). That loader uses Jiti v1 internally.
  • The demo repo uses a config like `tailwind.config.mjs` that re-exports `@repo/tailwind-config`. That package is loaded via Jiti v2 (through unbuild's stub), creating the v1/v2 Proxy recursion.
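For illustration, the triggering config has roughly this shape (the package name is the one from the demo repo; the exact export form in the reproduction may differ):

```javascript
// tailwind.config.mjs — an ESM config that only re-exports a shared preset.
// `@repo/tailwind-config` is resolved through unbuild's dev stub, which
// loads the package's sources via Jiti v2.
export { default } from '@repo/tailwind-config'
```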

Why the fix works

  • The fix bypasses Tailwind v3's `loadConfig` and uses our own Jiti v2 import (`createJiti(...).import(...)`) directly. That eliminates the mixed Jiti versions and prevents the Proxy recursion, so the config loads normally.

This was largely done by Codex; I refined it a little and confirmed that it resolves the reproduction in #431.

@Dunqing Dunqing force-pushed the fix/issue-431-jiti branch from d8ae771 to 1dda800 on February 5, 2026 at 14:58
@Dunqing Dunqing force-pushed the fix/issue-431-jiti branch from 1dda800 to 5f1f9aa on February 5, 2026 at 14:59
@Dunqing Dunqing (Contributor, Author) commented Feb 5, 2026

@Boshen cc

@Boshen Boshen merged commit 125a8bc into tailwindlabs:main Feb 5, 2026
1 check passed


Development

Successfully merging this pull request may close these issues.

Maximum call stack size exceeded

2 participants