Fix load_lora_weights crash on modular sub-pipelines without transformer #13495

Open

ParamChordiya wants to merge 3 commits into huggingface:main from ParamChordiya:fix/modular-pipeline-lora-missing-transformer

Conversation

@ParamChordiya

Summary

Fixes #13487 (Loading loras on "text_pipe" of a modular pipeline fails)

  • load_lora_weights crashes with AttributeError: 'Flux2ModularPipeline' object has no attribute 'transformer' when called on a modular sub-pipeline that doesn't have the transformer component (e.g., a text-encoder-only sub-pipeline)
  • Root cause: load_lora_weights accesses self.transformer unconditionally, so the attribute lookup raises on any sub-pipeline built without that component
  • Fix: use safe attribute access (getattr with default None) and skip loading with a warning when the target component is not available; see the sketch below
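
A minimal illustration of the access-pattern change (a sketch, not the exact diffusers code; the surrounding loading logic is omitted):

# Unsafe: raises AttributeError when the sub-pipeline has no transformer component.
transformer = self.transformer

# Safe: resolves to None, so the method can warn and return early instead of crashing.
transformer = getattr(self, "transformer", None)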

Changes

Flux2LoraLoaderMixin.load_lora_weights: checks if transformer exists; warns and returns early if missing.

FluxLoraLoaderMixin.load_lora_weights: guards transformer and text_encoder loading independently, so a sub-pipeline with only one of them still works; it returns early only if both are absent (sketched below).
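
A hedged sketch of that dual-guard shape (a fragment from inside the method; logger is the module-level diffusers logger, and the dispatch into the existing per-component loading helpers is elided):

transformer = getattr(self, "transformer", None)
text_encoder = getattr(self, "text_encoder", None)

if transformer is None and text_encoder is None:
    logger.warning(
        "Neither `transformer` nor `text_encoder` is available in this "
        "pipeline. Skipping LoRA weight loading."
    )
    return

if transformer is not None:
    ...  # load transformer LoRA layers via the existing helper
if text_encoder is not None:
    ...  # load text-encoder LoRA layers via the existing helper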

Test: added test_load_lora_weights_warns_when_transformer_missing to verify the fix.
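
A hedged sketch of what such a test can look like (the pipeline builder and the state-dict fixture are hypothetical stand-ins, not the PR's actual helpers):

def test_load_lora_weights_warns_when_transformer_missing(self):
    # Hypothetical helper: builds a modular sub-pipeline with no transformer.
    pipe = self.get_text_encoder_only_pipeline()
    with self.assertLogs("diffusers", level="WARNING") as captured:
        # Hypothetical fixture standing in for real LoRA weights.
        pipe.load_lora_weights(self.get_dummy_lora_state_dict())
    assert "Skipping LoRA weight loading" in "\n".join(captured.output)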

Before

text_pipe.load_lora_weights("flux.2-turbo-lora.safetensors")
# AttributeError: 'Flux2ModularPipeline' object has no attribute 'transformer'

After

text_pipe.load_lora_weights("flux.2-turbo-lora.safetensors")
# WARNING: The `transformer` component is not available in this pipeline. Skipping LoRA weight loading.

Test plan

  • Verify load_lora_weights on a full Flux2Pipeline still works normally
  • Verify load_lora_weights on a text-encoder-only modular sub-pipeline warns and skips without crashing
  • Verify load_lora_weights on a denoise-only modular sub-pipeline loads transformer LoRA correctly
  • Run existing LoRA tests: pytest tests/lora/test_lora_layers_flux2.py
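
To run only the new test, standard pytest -k filtering works:

pytest tests/lora/test_lora_layers_flux2.py -k test_load_lora_weights_warns_when_transformer_missing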

When calling `load_lora_weights` on a modular sub-pipeline that only
contains certain components (e.g., text encoders), the method crashes
with `AttributeError` because it unconditionally accesses
`self.transformer`.

This change uses safe attribute access (`getattr` with default `None`)
and skips loading with a warning when the target component is not
available on the pipeline.

Fixes both `Flux2LoraLoaderMixin` and `FluxLoraLoaderMixin`:
- Flux2: warns and returns early if transformer is missing
- Flux1: independently skips transformer/text_encoder loading when
  either is missing, only returns early if both are absent

Fixes huggingface#13487
@github-actions bot added the lora, tests, and size/M (PR with diff < 200 LOC) labels on Apr 17, 2026
@yiyixuxu (Collaborator) left a comment


Thanks for the PR!
I left one comment

Comment thread: src/diffusers/loaders/lora_pipeline.py
…coder handling

Per reviewer feedback, Flux LoRA does not touch text_encoder, so the existing
assumption should be preserved. Only guard against a missing transformer:
warn and return early if transformer is None, then proceed with the original logic.
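
With that change, the Flux1 path converges on the same single guard as Flux2; a hedged sketch of the post-review shape (a fragment from inside the method; logger is the module-level diffusers logger):

if getattr(self, "transformer", None) is None:
    logger.warning(
        "The `transformer` component is not available in this pipeline. "
        "Skipping LoRA weight loading."
    )
    return
# ...the original loading logic proceeds unchanged from here...
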
@github-actions bot added the fixes-issue and size/S (PR with diff < 50 LOC) labels and removed the size/M (PR with diff < 200 LOC) label on Apr 23, 2026
2 participants