
Fix chemprop init with trainer #181

Open
JenniferHem wants to merge 6 commits into main from
fix-chemprop-init-with-trainer
Conversation

@JenniferHem
Collaborator

Issue:

Currently a pl.Trainer object can be initialized via from lightning import pytorch as pl, but also via import pytorch_lightning as pl. We use methods to get and set params in Chemprop, and unfortunately a Trainer object is newly initialized whenever set or update params is called. At that point, however, an Accelerator object has already been instantiated, which leads to a problem because lightning requires a string. The "get device" function uses isinstance to detect whether a CPU or GPU accelerator was chosen and converts it back into a string. This isinstance check, however, only works for the lightning import and fails if pytorch_lightning is used.

Solution:

To avoid adding pytorch_lightning as a dependency, we instead enhanced the validation: if a user does not use lightning (but used pytorch_lightning instead), a corresponding ValueError is now raised.
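A minimal sketch of the validation logic described above. The accelerator classes here are self-contained stand-ins for the real lightning.pytorch.accelerators classes, and the helper name get_device is illustrative; this is not the actual code in molpipeline/estimators/chemprop/lightning_wrapper.py.

```python
# Stand-ins for lightning.pytorch.accelerators.CPUAccelerator / CUDAAccelerator,
# used so the sketch runs without lightning installed.
class CPUAccelerator: ...
class CUDAAccelerator: ...


def get_device(accelerator):
    """Map an instantiated accelerator back to the string lightning expects.

    An accelerator created via the legacy ``import pytorch_lightning as pl``
    path is a different class and fails the isinstance checks, so it is
    rejected with a descriptive ValueError instead of crashing later.
    """
    if isinstance(accelerator, str):
        return accelerator  # already a string, nothing to convert
    if isinstance(accelerator, CPUAccelerator):
        return "cpu"
    if isinstance(accelerator, CUDAAccelerator):
        return "gpu"
    raise ValueError(
        f"Unsupported accelerator {type(accelerator).__name__!r}: please "
        "construct the Trainer via `from lightning import pytorch as pl`, "
        "not `import pytorch_lightning as pl`."
    )
```

The key design choice is failing fast with an actionable message at parameter-setting time, rather than letting the unrecognized accelerator propagate into Trainer re-initialization.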

Comment thread pyproject.toml
Comment thread test_extras/test_chemprop/test_chemprop_pipeline.py
Comment thread molpipeline/estimators/chemprop/lightning_wrapper.py Outdated
Comment thread molpipeline/estimators/chemprop/lightning_wrapper.py Outdated


2 participants