Problem in backward #7490

@haoweiz23

Description

Describe the bug

Hi, I modified the U-Net forward function to use a "wrapped_forward" function, but when I attempt to backpropagate the loss I get the error shown below. My training code is adapted from "examples/train_dreambooth.py". The traceback appears to be related to gradient checkpointing, but I couldn't find any useful information to resolve it.
[image: screenshot of the error traceback]

Reproduction

[image: screenshot of the modified forward/training code]
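The reproduction screenshot is not preserved here. As a hedged, minimal sketch (not the reporter's actual code) of how this class of error typically arises: `torch.utils.checkpoint`'s default re-entrant implementation re-runs the wrapped forward under `torch.no_grad`, so if none of the tensor inputs has `requires_grad=True` (common for latents in a DreamBooth loop), backward can fail with "element 0 of tensors does not require grad". On PyTorch 1.13+, passing `use_reentrant=False` selects the non-re-entrant implementation, which tracks gradients through module parameters correctly. The module and shapes below are illustrative placeholders:

```python
import torch
from torch.utils.checkpoint import checkpoint

class Block(torch.nn.Module):
    """Stand-in for a checkpointed U-Net sub-block."""
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        return self.linear(x)

block = Block()
x = torch.randn(2, 4)  # input does NOT require grad, like noised latents

# With the default re-entrant checkpointing and no input requiring grad,
# backward can fail; use_reentrant=False (PyTorch >= 1.13) avoids this
# by recording the autograd graph on the first forward pass.
out = checkpoint(block, x, use_reentrant=False)
loss = out.sum()
loss.backward()
print(block.linear.weight.grad is not None)  # parameter grads were produced
```

Whether this matches the reported failure depends on how "wrapped_forward" interacts with `unet.enable_gradient_checkpointing()`; the sketch only shows the most common cause of a checkpointing-related backward error.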

Logs

No response

System Info

  • diffusers version: 0.28.0.dev0
  • Platform: Linux-5.4.0-148-generic-x86_64-with-glibc2.31
  • Python version: 3.10.6
  • PyTorch version (GPU?): 1.13.1+cu117 (True)
  • Huggingface_hub version: 0.20.2
  • Transformers version: 4.39.0
  • Accelerate version: 0.25.0
  • xFormers version: not installed
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help?

No response
