Fix hash for chemprop#248

Open
c-w-feldmann wants to merge 17 commits into main from fix-hash-for-chemprop
Conversation

@c-w-feldmann
Collaborator

To use joblib.Memory in a Pipeline for more efficient hyperparameter searches, the hash of the neural fingerprint must be constant for identical weights. This is not guaranteed for torch.Tensors, so the state must be serialized before hashing. Even then, a simple clone does not produce the same hash, because each model is initialized with random weights. Hence, for testing, the state_dict_ref is provided to ensure that both models are initialized with the same weights.
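As a simplified, stdlib-only sketch of the idea (not MolPipeline's actual implementation, which serializes torch state_dicts): hashing a canonical serialization of the weights yields equal hashes for equal states, while a freshly randomized model hashes differently. The helper name `hash_state` and the toy state dicts below are illustrative assumptions.

```python
import hashlib
import json


def hash_state(state: dict) -> str:
    """Hash a state dict of plain Python numbers deterministically.

    sort_keys makes the serialization independent of insertion order,
    so two models with identical weights always hash the same.
    """
    payload = json.dumps(state, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


# Two models initialized from the same reference state hash identically ...
state_a = {"layer1.weight": [[0.1, 0.2], [0.3, 0.4]], "layer1.bias": [0.0, 0.0]}
state_b = json.loads(json.dumps(state_a))  # an independent deep copy
assert hash_state(state_a) == hash_state(state_b)

# ... while a model with different (e.g. randomly initialized) weights does not.
state_c = {"layer1.weight": [[0.9, 0.2], [0.3, 0.4]], "layer1.bias": [0.0, 0.0]}
assert hash_state(state_a) != hash_state(state_c)
```

This is exactly the property joblib.Memory needs: the cache key derived from the estimator must not change between two runs with the same weights.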

@soulios-basf soulios-basf self-requested a review February 20, 2026 13:55
Comment on molpipeline/estimators/chemprop/abstract.py (Outdated)
Collaborator

@soulios-basf soulios-basf left a comment


  1. Do we care about devices? This would always place the tensors on CPU; we could also enforce a device if one is available.
  2. I am not 100% sure whether the dtypes survive the JSON round-trip (they might be reloaded as float64), so could you add an assertion for the dtype in the test?
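The dtype concern in point 2 is real for plain JSON: Python's json module represents every number as a 64-bit float, so any float32 provenance is lost unless it is recorded explicitly. A minimal stdlib sketch of one way to make the dtype assertable after reload (the helper names `serialize_tensor` and `deserialize_tensor` are hypothetical, not part of this PR):

```python
import json


def serialize_tensor(values: list, dtype: str) -> str:
    # Record the dtype as an explicit tag; json alone cannot distinguish
    # float32 from float64, since it only has one number type.
    return json.dumps({"dtype": dtype, "values": values})


def deserialize_tensor(payload: str) -> tuple:
    obj = json.loads(payload)
    return obj["values"], obj["dtype"]


payload = serialize_tensor([0.5, 0.25], "float32")
values, dtype = deserialize_tensor(payload)
assert dtype == "float32"  # the tag survives the round-trip
assert values == [0.5, 0.25]
```

With such a tag in place, the test the reviewer asks for reduces to one assertion on the reloaded dtype.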

Collaborator

@soulios-basf soulios-basf left a comment


lgtm!
