
fix: VLMModelWrapper PEFT isinstance compatibility for TRL validation#252

Merged
abrichr merged 1 commit into main from fix/vlm-wrapper-peft-delegation
Mar 29, 2026

Conversation


@abrichr abrichr commented Mar 29, 2026

Summary

TRL's validate_quantization_for_training() uses isinstance(model, PeftModel) to check for adapters. The VLMModelWrapper hid the PeftModel, causing: "You cannot perform fine-tuning on purely quantized models."

Fix: Dynamically create a combined class inheriting from both VLMModelWrapper and the wrapped model's type. isinstance() now passes, and the wrapper's forward/generate/cache methods still take priority via the MRO.
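The dynamic-class idea can be sketched as follows. This is a minimal illustration, not the actual VLMModelWrapper implementation: the `PeftModel` here is a stand-in class, and `wrap_model` is a hypothetical helper name.

```python
class PeftModel:
    """Stand-in for peft.PeftModel, used only for this sketch."""
    def __init__(self):
        self.peft_config = {"default": "lora-config"}


class VLMModelWrapper:
    """Simplified wrapper: holds the real model and overrides forward."""
    def __init__(self, model):
        object.__setattr__(self, "_wrapped", model)

    def forward(self, *args, **kwargs):
        # Wrapper-specific logic would go here before delegating.
        return self._wrapped.forward(*args, **kwargs)

    def __getattr__(self, name):
        # Only called when normal lookup fails: delegate unknown
        # attributes (e.g. peft_config) to the wrapped model.
        return getattr(object.__getattribute__(self, "_wrapped"), name)


def wrap_model(model):
    # Build a combined class so isinstance(wrapper, type(model)) is True.
    # VLMModelWrapper comes first in the bases, so its methods win in the MRO.
    combined = type(
        f"Wrapped{type(model).__name__}",
        (VLMModelWrapper, type(model)),
        {},
    )
    wrapper = combined.__new__(combined)
    VLMModelWrapper.__init__(wrapper, model)
    return wrapper


wrapper = wrap_model(PeftModel())
assert isinstance(wrapper, PeftModel)   # TRL's isinstance check now passes
assert wrapper.peft_config == {"default": "lora-config"}  # delegation works
```

Because `combined.__new__` is used instead of normal construction, the wrapped model's `__init__` is never re-run on the wrapper; the instance only carries the `_wrapped` reference, and everything else resolves through delegation or the MRO.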

Tests added

  • test_peft_attributes_delegated — peft_config accessible
  • test_hasattr_peft_config — hasattr() works for TRL
  • test_isinstance_peft_model — isinstance(wrapper, PeftModel) == True
  • test_wrapper_passes_peft_validation (e2e) — simulates TRL validation
  • test_wrapper_preserves_trainable_parameters (e2e) — optimizer setup works

Test plan

  • 1515 passed in full suite
  • 12 wrapper unit tests pass, 1 skipped (peft not installed)
  • Client GPU test — should pass TRL's validation

🤖 Generated with Claude Code

@abrichr abrichr merged commit 7879dee into main Mar 29, 2026
1 check passed
