Commit 29f9259

Fix torch_dtype warning showing wrong value in from_single_file
The warning message intended to show the invalid torch_dtype value passed by the user, but the variable was already reassigned to torch.float32 before the warning was logged. Move the reassignment after the warning so the message correctly displays the original invalid value.
1 parent b8aebf4 commit 29f9259

1 file changed: 1 addition & 1 deletion

src/diffusers/loaders/single_file.py

@@ -366,10 +366,10 @@ def from_single_file(cls, pretrained_model_link_or_path, **kwargs) -> Self:
         is_legacy_loading = False

         if torch_dtype is not None and not isinstance(torch_dtype, torch.dtype):
-            torch_dtype = torch.float32
             logger.warning(
                 f"Passed `torch_dtype` {torch_dtype} is not a `torch.dtype`. Defaulting to `torch.float32`."
             )
+            torch_dtype = torch.float32

         # We shouldn't allow configuring individual models components through a Pipeline creation method
         # These model kwargs should be deprecated
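The bug pattern is easy to reproduce outside of diffusers: if you assign the default before logging, the warning interpolates the default rather than the caller's invalid value. Below is a minimal, torch-free sketch of the fixed ordering; `Dtype`, `FLOAT32`, and `coerce_torch_dtype` are hypothetical stand-ins, not the diffusers API.

```python
import logging

logger = logging.getLogger("single_file_example")


class Dtype:
    """Stand-in for torch.dtype so the sketch runs without torch."""

    def __init__(self, name):
        self.name = name

    def __repr__(self):
        return self.name


FLOAT32 = Dtype("torch.float32")


def coerce_torch_dtype(torch_dtype):
    # Warn first, while torch_dtype still holds the caller's value...
    if torch_dtype is not None and not isinstance(torch_dtype, Dtype):
        logger.warning(
            f"Passed `torch_dtype` {torch_dtype} is not a `torch.dtype`. "
            "Defaulting to `torch.float32`."
        )
        # ...then apply the default, mirroring the ordering in the fix above.
        torch_dtype = FLOAT32
    return torch_dtype
```

With the pre-fix ordering (assignment before `logger.warning`), the message would always read "Passed `torch_dtype` torch.float32 is not a `torch.dtype`", hiding what the user actually passed.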

0 commit comments
