
Repair the loss parameter #37

Open
hwmaybe wants to merge 1 commit into rafellerc:master from hwmaybe:loss_repair

Conversation


@hwmaybe hwmaybe commented Oct 27, 2020

In PyTorch 0.4.1, the valid choices for the `reduction` parameter of `torch.nn.functional.binary_cross_entropy_with_logits` are 'none' | 'elementwise_mean' | 'sum'. The master branch passes 'mean', which is not accepted by that version and leads to `ValueError: Input contains NaN, infinity or a value too large for dtype('float32')`.
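One way to sketch the fix is a small version-aware helper that picks the correct keyword. This is a hypothetical illustration, not code from the PR; the version cutoff assumes the rename from 'elementwise_mean' to 'mean' happened around PyTorch 1.0, and the helper takes the version string as an argument so it can be shown without importing torch.

```python
def compatible_reduction(torch_version: str) -> str:
    """Return the mean-reduction keyword accepted by the given PyTorch version.

    Hypothetical helper: PyTorch 0.4.x expects 'elementwise_mean', while
    later releases (assumed here to start at 1.0) renamed it to 'mean'.
    """
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    if (major, minor) < (1, 0):
        return "elementwise_mean"
    return "mean"

# Usage sketch (torch import and call shown as comments only):
# import torch
# import torch.nn.functional as F
# loss = F.binary_cross_entropy_with_logits(
#     logits, targets,
#     reduction=compatible_reduction(torch.__version__),
# )
```

Pinning the code to one keyword (as this PR does) is simpler if the project only supports PyTorch 0.4.1; the helper above is only useful when both old and new versions must work.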
