Fix hash for chemprop#248
Open
c-w-feldmann wants to merge 17 commits into
Conversation
soulios-basf (Collaborator) left a comment
- Do we care about devices? This would always place the tensors on the CPU; we could also enforce a device (e.g. GPU) if one is available.
- I am not 100% sure whether the dtypes survive the round trip through JSON (they may be reloaded as float64), so could you add an assertion for this in the test?
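The dtype concern above can be checked with a small sketch (NumPy used as a stand-in for torch tensors; the variable names are illustrative, not from the PR): JSON stores plain numbers, so reloading weights defaults to float64, and the test should assert the restored dtype explicitly.

```python
import json

import numpy as np

# Original weights are stored as float32.
weights = np.arange(3, dtype=np.float32)

# Round trip through JSON: the dtype information is lost,
# and np.array() reconstructs the values as float64.
reloaded = np.array(json.loads(json.dumps(weights.tolist())))
assert reloaded.dtype == np.float64

# The loading code must therefore cast back, and the test
# should assert the dtype of the restored tensor.
restored = reloaded.astype(np.float32)
assert restored.dtype == np.float32
```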
In order to use joblib.Memory in a Pipeline for more efficient hparam searches, the hash of the Neural Fingerprint must be constant for the same weights. This is not the case for torch.Tensors, so the state must be serialized before hashing. However, even a simple clone does not yield the same hash, since each model is initialized with random weights. Hence, for testing, the state_dict_ref is provided to ensure that both models are initialized with the same weights.
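The idea can be sketched as follows. This is a minimal illustration, not the PR's implementation: NumPy arrays stand in for torch tensors, and `weight_hash` and the `state_dict_ref` dictionary are hypothetical names. Hashing the serialized parameter bytes (rather than the tensor objects) makes the hash depend only on the weight values, so two models initialized from the same reference state hash identically.

```python
import hashlib

import numpy as np


def weight_hash(state):
    """Hash a mapping of parameter names to arrays via their raw bytes."""
    digest = hashlib.sha256()
    for name in sorted(state):  # fixed order for a deterministic hash
        digest.update(name.encode())
        digest.update(np.ascontiguousarray(state[name]).tobytes())
    return digest.hexdigest()


# A shared reference state, mirroring the role of state_dict_ref.
rng = np.random.default_rng(0)
state_dict_ref = {"encoder.weight": rng.standard_normal((2, 4))}

# Two "models" initialized from the same reference state dict.
model_a = {k: v.copy() for k, v in state_dict_ref.items()}
model_b = {k: v.copy() for k, v in state_dict_ref.items()}

# Identical weights yield an identical, reproducible hash.
assert weight_hash(model_a) == weight_hash(model_b)
```

Without the shared reference state, each model would start from its own random initialization and the hashes would differ, which is exactly why the test needs `state_dict_ref`.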