
Commit c329345

dougborg and claude authored
test(mcp): pin MO+datetime regression on typed-cache write path (#632) (#678)
#632 reported `list_manufacturing_orders` failing with `Object of type datetime is not JSON serializable` on every page. Root cause was a ManufacturingOrder row carrying a SerialNumber with a populated `transaction_date`: that datetime lives inside the `serial_numbers` PydanticJSON column on CachedManufacturingOrder, and SQLAlchemy's stock `json.dumps` rejected it.

The underlying fix already landed in 1174c34 ("fix(client): PydanticJSON serializes nested datetimes in plain dicts/lists") for the parent #659 cluster, which routes JSON-column values through `pydantic_core.to_jsonable_python` and so handles nested datetimes uniformly. The CachedSupplier case already had regression coverage; this commit adds the parallel MO case so the fix can't silently regress only for MO-shape rows.

The new test mirrors the production path exactly:

1. PydanticManufacturingOrder with a SerialNumber.transaction_date
2. `_convert`-style model_dump → CachedManufacturingOrder.model_validate
3. `_bulk_upsert`-style row.model_dump(include=column_names)
4. `sqlite_insert(...).values(values)` + read-back

Closes #632.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
1 parent 32b8e07 commit c329345

2 files changed

Lines changed: 109 additions & 2 deletions


katana_mcp_server/tests/test_typed_cache.py

Lines changed: 108 additions & 1 deletion
@@ -613,7 +613,9 @@ async def test_json_column_handles_nested_datetime_in_plain_dicts(
             SupplierAddress,
         )
 
-        now = datetime.now(tz=UTC)
+        # Frozen literal so the test is hermetic — any future value-equality
+        # check stays reproducible, and failure output is stable across runs.
+        now = datetime(2026, 1, 15, 12, 0, 0, tzinfo=UTC)
         api_supplier = PydanticSupplier(
             id=42,
             name="Acme Supplies",
@@ -661,6 +663,111 @@ async def test_json_column_handles_nested_datetime_in_plain_dicts(
         assert isinstance(addr["created_at"], str)
         assert isinstance(addr["updated_at"], str)
 
+    @pytest.mark.asyncio
+    async def test_mo_serial_numbers_datetime_round_trip_via_bulk_upsert(
+        self, typed_cache_engine
+    ):
+        """Regression (#632): ``list_manufacturing_orders`` triggered
+        ``Object of type datetime is not JSON serializable`` during
+        ``_bulk_upsert`` whenever an MO in the fetched page carried a
+        ``SerialNumber`` with a populated ``transaction_date``.
+
+        ``CachedManufacturingOrder.serial_numbers: list[SerialNumber] |
+        None`` is a ``Column(PydanticJSON)`` column, and ``SerialNumber``
+        has a ``transaction_date: AwareDatetime | None`` field. The full
+        production path is:
+
+        1. ``_convert`` runs ``api_obj.model_dump()`` (mode="python") →
+           the nested SerialNumber becomes a plain dict whose
+           ``transaction_date`` leaf is a live ``datetime``.
+        2. ``cache_cls.model_validate(parent_data)`` reconstructs real
+           ``SerialNumber`` instances on the cache row (``Mapped`` is an
+           identity shim at runtime, so the effective field type is
+           ``list[SerialNumber] | None`` and pydantic happily parses the
+           dicts back). The ``transaction_date`` stays a live ``datetime``.
+        3. ``_bulk_upsert`` calls ``row.model_dump(include=column_names)``
+           (also mode="python"). The nested SerialNumbers are flattened
+           back to dicts; ``transaction_date`` remains a live ``datetime``
+           on the leaf. The list-of-dicts is then handed to
+           ``sqlite_insert(...).values(values)``.
+        4. SQLAlchemy binds the ``serial_numbers`` value to the
+           ``PydanticJSON`` column, which (post #659 / commit 1174c34c)
+           routes through ``to_jsonable_python`` and produces a JSON-safe
+           value before the stock ``json.dumps`` runs.
+
+        The fix landed in the client; this test pins the MO case
+        explicitly so the parallel CachedSupplier coverage above doesn't
+        accidentally regress only for MO-shape rows. The two tests are
+        kept separate rather than parametrized — distinct import paths
+        and field shapes (SupplierAddress vs SerialNumber) make a shared
+        fixture noisier than the duplication it would remove.
+        """
+        from datetime import UTC, datetime
+
+        from sqlalchemy import inspect as sqla_inspect
+        from sqlalchemy.dialects.sqlite import insert as sqlite_insert
+
+        from katana_public_api_client.models_pydantic._generated import (
+            CachedManufacturingOrder,
+            ManufacturingOrder as PydanticManufacturingOrder,
+            ManufacturingOrderStatus,
+        )
+        from katana_public_api_client.models_pydantic._generated.stock import (
+            SerialNumber,
+        )
+
+        # Frozen literal so the test is hermetic. ``id`` and ``order_no``
+        # carry the SO ref + issue number from the #632 crash report as a
+        # breadcrumb back to the originating session.
+        now = datetime(2026, 1, 15, 12, 0, 0, tzinfo=UTC)
+        api_mo = PydanticManufacturingOrder(
+            id=44256191,
+            order_no="MFG-#632",
+            status=ManufacturingOrderStatus.in_progress,
+            order_created_date=now,
+            serial_numbers=[
+                SerialNumber(
+                    id=1,
+                    serial_number="SN-001",
+                    transaction_date=now,
+                    quantity_change=1,
+                ),
+            ],
+        )
+
+        # Mirror ``_convert``: model_dump → model_validate on the cache class.
+        cached = CachedManufacturingOrder.model_validate(api_mo.model_dump())
+
+        # Mirror ``_bulk_upsert``: include=column_names + dialect insert.
+        # ``_bulk_upsert`` also appends ``on_conflict_do_update`` for the
+        # upsert semantics, but the regression target is the bind-param
+        # path that runs before the conflict clause is even evaluated, so
+        # a plain INSERT is sufficient to exercise it.
+        mapper = sqla_inspect(CachedManufacturingOrder)
+        column_names = {col.name for col in mapper.columns}
+        values = [cached.model_dump(include=column_names)]
+        stmt = sqlite_insert(CachedManufacturingOrder).values(values)
+        async with typed_cache_engine.session() as session:
+            # Pre-fix: this raised ``TypeError: Object of type datetime is
+            # not JSON serializable`` and killed the whole 30-row page.
+            await session.exec(stmt)
+            await session.commit()
+
+        async with typed_cache_engine.session() as session:
+            stmt2 = select(CachedManufacturingOrder).where(
+                CachedManufacturingOrder.id == 44256191
+            )
+            result = await session.exec(stmt2)
+            fetched = result.one()
+            assert fetched.serial_numbers is not None
+            assert len(fetched.serial_numbers) == 1
+            sn = fetched.serial_numbers[0]
+            assert isinstance(sn, dict)
+            assert sn["serial_number"] == "SN-001"
+            # ``transaction_date`` survives the JSON round-trip as an
+            # ISO-8601 string (json.loads doesn't reconstruct datetimes).
+            assert isinstance(sn["transaction_date"], str)
+
     def test_pydantic_json_serializes_plain_dict_with_datetime(self):
         """Unit-level regression (#659): PydanticJSON's ``process_bind_param``
         must JSON-serialize plain dicts/lists whose leaves contain live

uv.lock

Lines changed: 1 addition & 1 deletion

0 commit comments
