FIX Transformers weight conversion regression #3197

Merged
BenjaminBossan merged 2 commits into huggingface:main from
BenjaminBossan:fix-weight-conversion-transformers-45448
Apr 29, 2026

Conversation

@BenjaminBossan (Member) commented Apr 24, 2026

After a change in huggingface/transformers#45448, weight conversion tests started failing. Transformers provided a fix in huggingface/transformers#45622 but it needs to be ported to PEFT too. The PR also ports the fix from huggingface/transformers#45428.

This PR fixes the issue from the PEFT side. For the transformers side (model.load_adapter), we need to wait for huggingface/transformers#45622 to land and be released.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan (Member Author)

@Cyrilvallez please review.

@Cyrilvallez (Member) left a comment


LGTM, it matches the Transformers PR!

Comment on lines +255 to +258

```diff
-for pat in list(orig_conversion.source_patterns):
+for pat in list(orig_conversion._original_source_patterns):
```
@Cyrilvallez (Member)

If orig_conversion stems from transformers then I think it might be a breaking change? Because if someone is using an older transformers version, then it could be problematic? If not, then it's okay.

@BenjaminBossan (Member Author)

@Cyrilvallez Would this change work with older Transformers versions?

@Cyrilvallez (Member)

It was added in huggingface/transformers#45340, so it will indeed not be compatible with all previous versions. For your use case, you could simply use source_patterns/target_patterns instead of their _original_xxx counterparts to avoid checking versions.

@BenjaminBossan (Member Author)

Got it, I reverted the argument names back. I tested locally with transformers v5.4.0, v5.5.4, and the latest main, and the test passed.

What I wonder about: if we can use source_patterns/target_patterns, why was the change to _original_xxx made in the first place?
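To illustrate the compatibility concern discussed above, here is a hypothetical sketch of reading the patterns in a version-agnostic way. The `Conversion` class below is a minimal stand-in, not the real Transformers conversion object; only the attribute names `source_patterns` and `_original_source_patterns` come from the diff, and `get_source_patterns` is an illustrative helper.

```python
class Conversion:
    """Minimal stand-in for a weight-conversion entry. On older
    transformers versions only ``source_patterns`` exists; the
    ``_original_source_patterns`` attribute was added later
    (huggingface/transformers#45340)."""

    def __init__(self, source_patterns, original_source_patterns=None):
        self.source_patterns = source_patterns
        if original_source_patterns is not None:
            self._original_source_patterns = original_source_patterns


def get_source_patterns(conversion):
    # Prefer the newer attribute when present, fall back to the one
    # available in all versions, avoiding explicit version checks.
    return list(
        getattr(conversion, "_original_source_patterns", conversion.source_patterns)
    )


old_style = Conversion(["model.layers.*"])
new_style = Conversion(["model.layers.*"], original_source_patterns=["layers.*"])
print(get_source_patterns(old_style))  # ['model.layers.*']
print(get_source_patterns(new_style))  # ['layers.*']
```

Sticking with `source_patterns`, as the PR ultimately does, sidesteps even this fallback and works on every release.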

@BenjaminBossan (Member Author)

Tests are now passing (except the ones on the Transformers side, which require a new Transformers release).

@BenjaminBossan (Member Author)

Transformers release is out, so we just need this PR for tests to pass again. I'll go ahead and merge it.

@BenjaminBossan BenjaminBossan merged commit 2bf7bc2 into huggingface:main Apr 29, 2026
2 of 10 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-weight-conversion-transformers-45448 branch April 29, 2026 12:28