
Commit 25dcf41

test(data-plane): move data_plane unit tests under tests/unit/ for CI discovery
test(data-plane): move data_plane unit tests under tests/unit/ for CI discovery

tests/unit/L0_Unit_Tests_*.sh hard-code TEST_PATHS=('unit/'), so any test outside tests/unit/ is silently skipped. Our data_plane suite lived at tests/data_plane/unit/ and was never collected by CI. Move all 19 unit tests + conftest + __init__ into tests/unit/data_plane/ via git mv (one file was untracked, so plain mv). The tests/data_plane/functional/ tree stays where it is: those are Tier 2 (Ray + TQ) and need a separate runner.

Plus four drive-by fixes flagged by CI lint:

- nemo_rl/data_plane/docs/data_plane_api_lifecycle.md → data-plane-api-lifecycle.md (a new pre-commit hook disallows underscores in .md filenames).
- nemo_rl/data_plane/column_io.py:158: change the variable annotation from Mapping to dict so pyrefly accepts the dict-comprehension result at the pack_jagged_fields call site (the function signature is dict[str, ...]). The Mapping import becomes unused; ruff auto-fixes it.
- nemo_rl/data_plane/worker_mixin.py:224,249: pyrefly reports no-matching-overload on list(meta.fields), where meta.fields: list[str] | None. Use # type: ignore[no-matching-overload] rather than list(meta.fields or []): the runtime contract guarantees meta.fields is non-None at these call sites, and silently substituting [] would mean "fetch nothing", which is wrong.
- nemo_rl/experience/sync_rollout_actor.py: add import torch (F821: the module used torch.zeros_like and isinstance(v, torch.Tensor) without importing torch).

Signed-off-by: Zhiyu Li <zhiyul@NVIDIA.com>
1 parent eecbcc4 commit 25dcf41
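The discovery gap described in the commit message can be sketched in a few lines of bash. This is a hypothetical reconstruction of the runner's path filter, not the actual L0_Unit_Tests_*.sh contents; the directory and file names are illustrative.

```shell
# Sketch of the CI discovery problem, assuming the runner only searches
# under the entries of TEST_PATHS. A suite at tests/data_plane/unit/ is
# never visited, so its tests are silently skipped rather than failing.
tmp=$(mktemp -d)
mkdir -p "$tmp/tests/unit/data_plane" "$tmp/tests/data_plane/unit"
touch "$tmp/tests/unit/data_plane/test_column_io.py"
touch "$tmp/tests/data_plane/unit/test_column_io.py"

TEST_PATHS=('unit/')   # hard-coded in the runner scripts
cd "$tmp/tests"
collected=$(for p in "${TEST_PATHS[@]}"; do find "$p" -name 'test_*.py'; done)
echo "$collected"      # only the copy under tests/unit/ is found
```

Moving the suite under tests/unit/ makes it fall inside the hard-coded glob, which is a smaller change than teaching every runner script a second root.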

23 files changed

Lines changed: 161 additions & 4 deletions

nemo_rl/data_plane/column_io.py

Lines changed: 2 additions & 2 deletions
@@ -29,7 +29,7 @@
 :class:`KVBatchMeta`.
 """
 
-from typing import Any, Mapping, Sequence
+from typing import Any, Sequence
 
 import numpy as np
 import torch
@@ -155,7 +155,7 @@ def kv_first_write(
             f"kv_first_write: keys ({len(keys)}) must match batch size ({n})"
         )
     lengths = final_batch_cpu["input_lengths"]
-    fields: Mapping[str, torch.Tensor | np.ndarray] = {
+    fields: dict[str, torch.Tensor | np.ndarray] = {
         k: v
         for k, v in final_batch_cpu.items()
         if isinstance(v, torch.Tensor)
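The annotation change above is a variance issue, not a runtime one. A minimal sketch of the same shape, using a stand-in function with a simplified signature (the real pack_jagged_fields takes tensors, not ints):

```python
from typing import Mapping


def pack_jagged_fields(fields: dict[str, int]) -> list[str]:
    """Stand-in for a function whose parameter is declared dict[str, ...]."""
    return sorted(fields)


batch = {"b": 2, "a": 1, "skip": None}

# Annotating the comprehension result as Mapping *widens* its type.
# Mapping is a supertype of dict, so a checker such as pyrefly rejects
# passing it where the parameter is declared dict[str, ...]:
#     wide: Mapping[str, int] = {...}
#     pack_jagged_fields(wide)   # no-matching-overload / arg-type error
#
# Annotating as dict matches the call site exactly:
fields: dict[str, int] = {k: v for k, v in batch.items() if isinstance(v, int)}
print(pack_jagged_fields(fields))  # ['a', 'b']
```

The dict comprehension already produces a dict at runtime, so the narrower annotation costs nothing and the Mapping import can be dropped.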
File renamed without changes.

nemo_rl/data_plane/worker_mixin.py

Lines changed: 2 additions & 2 deletions
@@ -221,7 +221,7 @@ def _fetch(
         td = self._require_dp_client().kv_batch_get(
             keys=meta.keys,
             partition_id=meta.partition_id,
-            select_fields=list(meta.fields),
+            select_fields=list(meta.fields),  # type: ignore[no-matching-overload]
         )
         data = materialize(
             td,
@@ -246,7 +246,7 @@ def _fetch(
         td = self._require_dp_client().kv_batch_get(
             keys=meta.keys,
             partition_id=meta.partition_id,
-            select_fields=list(meta.fields),
+            select_fields=list(meta.fields),  # type: ignore[no-matching-overload]
         )
         data = materialize(
             td,
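The commit silences the checker with a type: ignore because meta.fields is guaranteed non-None at these call sites. An alternative that narrows the Optional without a suppression comment is an assert, shown here as a hypothetical sketch (select_fields is an illustrative name, not the real method):

```python
from typing import Optional


def select_fields(fields: Optional[list[str]]) -> list[str]:
    # The runtime contract guarantees a populated field list here. The
    # assert documents that contract, narrows list[str] | None to
    # list[str] for the type checker, and fails loudly if violated.
    # By contrast, `list(fields or [])` would type-check but silently
    # turn a broken contract into "fetch nothing".
    assert fields is not None, "meta.fields must be set before fetch"
    return list(fields)


print(select_fields(["input_ids", "input_lengths"]))
```

Whether an assert or a targeted type: ignore is preferable is a style call; the ignore keeps the diff to a comment, while the assert converts a latent contract into a checked one.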

nemo_rl/experience/sync_rollout_actor.py

Lines changed: 1 addition & 0 deletions
@@ -40,6 +40,7 @@
 
 import numpy as np
 import ray
+import torch
 
 from nemo_rl.data_plane.column_io import kv_first_write
 from nemo_rl.data_plane.interfaces import KVBatchMeta
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

0 commit comments
