Merged
85 commits
fe8393a
Documentation and docstring graph and data
FilippoOlivo Mar 10, 2025
613dc54
Doc LabelTensor
FilippoOlivo Mar 11, 2025
33d9f15
Black formatting on LabelTensor
FilippoOlivo Mar 11, 2025
e9082df
Doc conditions
FilippoOlivo Mar 11, 2025
4dd00cd
Small fixes in conditions
FilippoOlivo Mar 11, 2025
37bb2af
Black formatting on condition
FilippoOlivo Mar 11, 2025
166493c
Codacy fix on condition
FilippoOlivo Mar 11, 2025
7a0c43b
Doc data
FilippoOlivo Mar 11, 2025
1dd7fef
Update doc condition
FilippoOlivo Mar 12, 2025
c4bdf03
Update doc data
FilippoOlivo Mar 12, 2025
8d4fee4
Update doc LT
FilippoOlivo Mar 12, 2025
2030004
Black formatting
FilippoOlivo Mar 12, 2025
85aa9eb
Doc collector
FilippoOlivo Mar 12, 2025
26ad3e6
Black formatting collector
FilippoOlivo Mar 12, 2025
a1b651a
Minor update
FilippoOlivo Mar 12, 2025
3b16bfe
beginning of domain doc
GiovanniCanali Mar 12, 2025
1d86d7a
Fix doc condition
FilippoOlivo Mar 12, 2025
8e2febf
Fix doc data
FilippoOlivo Mar 12, 2025
374c84b
Fix doc
FilippoOlivo Mar 12, 2025
1fa7c85
Other fixes
FilippoOlivo Mar 12, 2025
8b772ec
Black formatting
FilippoOlivo Mar 12, 2025
fa74493
fix doc domain
GiovanniCanali Mar 12, 2025
d559190
fix equation doc
GiovanniCanali Mar 12, 2025
e7c8710
Fix codacy
FilippoOlivo Mar 12, 2025
04ef78e
fix doc loss and codacy
GiovanniCanali Mar 12, 2025
41142c4
Update collector.py
FilippoOlivo Mar 12, 2025
3da693c
Update collector.py
FilippoOlivo Mar 12, 2025
0181851
fix operator doc
GiovanniCanali Mar 12, 2025
d6d778d
Additional fix in condition
FilippoOlivo Mar 12, 2025
7b09bf1
Additional fix in collector
FilippoOlivo Mar 12, 2025
593ab6b
Black formatting
FilippoOlivo Mar 12, 2025
3374176
fix utils and trainer doc
GiovanniCanali Mar 13, 2025
3f630d8
fix optim doc
GiovanniCanali Mar 13, 2025
02067fa
fix problem doc
GiovanniCanali Mar 13, 2025
d7debb2
start refactoring
dario-coscia Mar 13, 2025
0f6e6d7
adaptive_functions rst
dario-coscia Mar 13, 2025
78c319a
update rsts
dario-coscia Mar 13, 2025
240cbce
Fix doc data
FilippoOlivo Mar 12, 2025
c2f966e
Improve doc condition
FilippoOlivo Mar 13, 2025
9fba5a2
update rst
dario-coscia Mar 13, 2025
4583443
fix pinn doc
GiovanniCanali Mar 13, 2025
33abb66
updating rst
dario-coscia Mar 13, 2025
cba9f90
fix doc solver
GiovanniCanali Mar 13, 2025
e0fbd3d
fix rendering part 1
GiovanniCanali Mar 13, 2025
03ec91d
fix rendering part 2
GiovanniCanali Mar 13, 2025
488af56
black formatter
GiovanniCanali Mar 13, 2025
89f8e4e
Tmp fixes
FilippoOlivo Mar 14, 2025
9d9a01b
Fix rendering graph
FilippoOlivo Mar 14, 2025
974cbe4
Fix rendering LT
FilippoOlivo Mar 14, 2025
ada4f53
Black formatting
FilippoOlivo Mar 14, 2025
2edf4ea
fix doc model part 1
GiovanniCanali Mar 14, 2025
53be672
Fix conditions rendering
FilippoOlivo Mar 14, 2025
e102537
Fix rendering and codacy
FilippoOlivo Mar 14, 2025
e44da57
fix doc model part 2
GiovanniCanali Mar 14, 2025
1bbfa2e
modify poisson inv
dario-coscia Mar 14, 2025
da98d61
standardize module docstring
GiovanniCanali Mar 14, 2025
fc8ad33
Automatize Tutorials html, py files creation (#496)
dario-coscia Mar 15, 2025
f85d906
Remove MNIST data from tutorial 4
FilippoOlivo Mar 15, 2025
d05a27f
update workflow, rm tutorial 4 data
dario-coscia Mar 15, 2025
cb68ae5
update html dir
dario-coscia Mar 15, 2025
93b88a6
remove tutorials html
dario-coscia Mar 15, 2025
eb78831
add workflow_dispatch
dario-coscia Mar 15, 2025
c161138
trigger workflow tutorials
dario-coscia Mar 15, 2025
4330c9e
export tutorials changed in c161138 (#498)
github-actions[bot] Mar 15, 2025
cddac85
update doc
dario-coscia Mar 17, 2025
d7816cc
formatting
dario-coscia Mar 17, 2025
77e1e6a
modify automatic batching doc
dario-coscia Mar 17, 2025
d6f69a0
update contributing
dario-coscia Mar 17, 2025
b203353
update cite/team
dario-coscia Mar 17, 2025
0ea6888
format code
dario-coscia Mar 17, 2025
12fc1e5
adding layout.html template
dario-coscia Mar 17, 2025
c5bd22a
update datamodule doc
dario-coscia Mar 17, 2025
762dc7f
update tut11
dario-coscia Mar 17, 2025
923ec3c
export tutorials changed in 762dc7f (#500)
github-actions[bot] Mar 17, 2025
2099e87
formatting
dario-coscia Mar 17, 2025
e99ad86
doc test workflow update
dario-coscia Mar 17, 2025
a5b8844
docs->doc in testing_doc.yml
dario-coscia Mar 17, 2025
35aff03
automodule loss + update doc workflow
dario-coscia Mar 17, 2025
6d08b98
modify conf.py
dario-coscia Mar 17, 2025
d9f7ffc
update workflow file
dario-coscia Mar 17, 2025
9e1e4b9
modify doc workflows
dario-coscia Mar 17, 2025
d35daae
update versioning sphinx
dario-coscia Mar 17, 2025
9e90164
Add docstring for repeat in DataModule
FilippoOlivo Mar 17, 2025
fd070e5
Minor fix
FilippoOlivo Mar 17, 2025
40ad5c9
Remove useless line in sphinx config file
FilippoOlivo Mar 18, 2025
18 changes: 14 additions & 4 deletions pina/data/data_module.py
Expand Up @@ -283,10 +283,20 @@ def __init__(
Default is ``None``.
:param bool shuffle: Whether to shuffle the dataset before splitting.
Default ``True``.
:param bool repeat: Whether to repeat the dataset indefinitely.
Default ``False``.
:param automatic_batching: Whether to enable automatic batching.
Default ``False``.
:param bool repeat: If ``True`` and the batch size is larger than the
number of elements in a given condition, the elements are repeated
until the batch size is reached. If ``False``, the number of elements
in the batch is the minimum of the batch size and the number of
elements in the condition. Default is ``False``.
:param automatic_batching: If ``True``, automatic PyTorch batching
is performed, which consists of extracting one element at a time
from the dataset and collating the elements into a batch. This is
useful when the dataset is too large to fit into memory. If
``False``, the items are retrieved from the dataset all at once,
avoiding the overhead of collating them into a batch and reducing
the number of ``__getitem__`` calls to the dataset. This is useful
when the dataset fits into memory. Avoid automatic batching when
``batch_size`` is large. Default is ``False``.
:param int num_workers: Number of worker threads for data loading.
Default ``0`` (serial loading).
:param bool pin_memory: Whether to use pinned memory for faster data
Expand Down
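The ``repeat`` semantics described in the new docstring can be sketched in plain Python. This is an illustrative simplification, not actual PINA code; ``make_batch`` is a hypothetical helper standing in for the dataset's batch-assembly logic:

```python
from itertools import cycle, islice


def make_batch(condition_elements, batch_size, repeat=False):
    """Assemble one batch from a single condition's elements.

    Hypothetical helper illustrating the documented behavior: with
    repeat=True, elements are cycled until the batch size is reached;
    with repeat=False, the batch is capped at the condition size.
    """
    if repeat:
        # Repeat the condition's elements until the batch is full.
        return list(islice(cycle(condition_elements), batch_size))
    # Batch length is min(batch_size, number of elements in the condition).
    return condition_elements[:batch_size]


print(make_batch([1, 2, 3], 5, repeat=True))   # [1, 2, 3, 1, 2]
print(make_batch([1, 2, 3], 5, repeat=False))  # [1, 2, 3]
```

With ``repeat=True`` every condition contributes a full-size batch even when it holds fewer samples, which keeps per-condition batch shapes uniform during training.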
54 changes: 37 additions & 17 deletions pina/trainer.py
Expand Up @@ -26,6 +26,7 @@ def __init__(
test_size=0.0,
val_size=0.0,
compile=None,
repeat=False,
Collaborator
maybe set it to None and override inside?

automatic_batching=None,
num_workers=None,
pin_memory=None,
Expand All @@ -49,9 +50,13 @@ def __init__(
validation dataset. Default is ``0.0``.
:param bool compile: If ``True``, the model is compiled before training.
Default is ``False``. For Windows users, it is always disabled.
:param bool repeat: Whether to repeat the dataset data in each
condition during training. For further details, see the
:class:`~pina.data.PinaDataModule` class. Default is ``False``.
:param bool automatic_batching: If ``True``, automatic PyTorch batching
is performed. Avoid using automatic batching when ``batch_size`` is
large. Default is ``False``.
is performed, otherwise the items are retrieved from the dataset
all at once. For further details, see the
:class:`~pina.data.PinaDataModule` class. Default is ``False``.
:param int num_workers: The number of worker threads for data loading.
Default is ``0`` (serial loading).
:param bool pin_memory: Whether to use pinned memory for faster data
Expand All @@ -65,12 +70,13 @@ def __init__(
"""
# check consistency for init types
self._check_input_consistency(
solver,
train_size,
test_size,
val_size,
automatic_batching,
compile,
solver=solver,
train_size=train_size,
test_size=test_size,
val_size=val_size,
repeat=repeat,
automatic_batching=automatic_batching,
compile=compile,
)
pin_memory, num_workers, shuffle, batch_size = (
self._check_consistency_and_set_defaults(
Expand Down Expand Up @@ -110,14 +116,15 @@ def __init__(
self._move_to_device()
self.data_module = None
self._create_datamodule(
train_size,
test_size,
val_size,
batch_size,
automatic_batching,
pin_memory,
num_workers,
shuffle,
train_size=train_size,
test_size=test_size,
val_size=val_size,
batch_size=batch_size,
repeat=repeat,
automatic_batching=automatic_batching,
pin_memory=pin_memory,
num_workers=num_workers,
shuffle=shuffle,
)

# logging
Expand Down Expand Up @@ -151,6 +158,7 @@ def _create_datamodule(
test_size,
val_size,
batch_size,
repeat,
automatic_batching,
pin_memory,
num_workers,
Expand All @@ -169,6 +177,8 @@ def _create_datamodule(
:param float val_size: The percentage of elements to include in the
validation dataset.
:param int batch_size: The number of samples per batch to load.
:param bool repeat: Whether to repeat the dataset data in each
condition during training.
:param bool automatic_batching: Whether to perform automatic batching
with PyTorch. If ``True``, automatic PyTorch batching
is performed, which consists of extracting one element at a time
Expand Down Expand Up @@ -206,6 +216,7 @@ def _create_datamodule(
test_size=test_size,
val_size=val_size,
batch_size=batch_size,
repeat=repeat,
automatic_batching=automatic_batching,
num_workers=num_workers,
pin_memory=pin_memory,
Expand Down Expand Up @@ -253,7 +264,13 @@ def solver(self, solver):

@staticmethod
def _check_input_consistency(
solver, train_size, test_size, val_size, automatic_batching, compile
solver,
train_size,
test_size,
val_size,
repeat,
automatic_batching,
compile,
):
"""
Verifies the consistency of the parameters for the solver configuration.
Expand All @@ -265,6 +282,8 @@ def _check_input_consistency(
test dataset.
:param float val_size: The percentage of elements to include in the
validation dataset.
:param bool repeat: Whether to repeat the dataset data in each
condition during training.
:param bool automatic_batching: Whether to perform automatic batching
with PyTorch.
:param bool compile: If ``True``, the model is compiled before training.
Expand All @@ -274,6 +293,7 @@ def _check_input_consistency(
check_consistency(train_size, float)
check_consistency(test_size, float)
check_consistency(val_size, float)
check_consistency(repeat, bool)
if automatic_batching is not None:
check_consistency(automatic_batching, bool)
if compile is not None:
Expand Down
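The reviewer's suggestion on ``pina/trainer.py`` ("maybe set it to None and override inside?") corresponds to a common default-resolution pattern. A minimal sketch, assuming hypothetical helper names rather than the actual PINA implementation:

```python
def resolve_trainer_defaults(repeat=None, automatic_batching=None):
    """Resolve None-valued keyword arguments to their documented defaults.

    Hypothetical helper: defaulting ``repeat`` to None lets the caller
    distinguish "not passed" from an explicit False, so the default can
    be resolved (or overridden) in a single place inside __init__.
    """
    repeat = False if repeat is None else repeat
    automatic_batching = (
        False if automatic_batching is None else automatic_batching
    )
    return repeat, automatic_batching


print(resolve_trainer_defaults())             # (False, False)
print(resolve_trainer_defaults(repeat=True))  # (True, False)
```

The benefit over ``repeat=False`` in the signature is that downstream code (for example, the data module) can apply its own default when the user did not pass the argument explicitly.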