| `max_concurrent_activity_tasks` | `10` | Max parallel activity tasks |
| `shutdown_timeout` | `30.0` | Seconds to drain in-flight tasks on stop |

## Logging

The SDK uses Python's standard `logging` module with structured logger names:

| Logger Name | What It Logs |
| ----------- | ------------ |
| `durable_workflow.worker` | Worker registration, task polls, task completion, errors |
| `durable_workflow.workflow.replay` | Workflow replay events (silent during replay) |

### Configuring Logging

Set the logging level in your application's entry point:

```python
import logging

# Show INFO-level worker events (registration, task completion)
logging.basicConfig(level=logging.INFO)

# Or configure specific loggers
logging.getLogger("durable_workflow.worker").setLevel(logging.DEBUG)
logging.getLogger("durable_workflow.workflow.replay").setLevel(logging.INFO)
```

### Log Levels

- **INFO**: Worker registration, task starts/completions, workflow completion
- **DEBUG**: Detailed task payloads (truncated), poll cycles
- **WARNING**: Retryable errors (failed API calls, unknown workflow types)
- **ERROR**: Non-retryable failures, replay crashes

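Building on these levels, here is a minimal sketch of level-based routing using only the standard library: DEBUG/INFO to stdout, WARNING and above to stderr. The `durable_workflow` logger name comes from the table above; the handler layout itself is an assumption about your application, not SDK behavior.

```python
import logging
import sys

# Route SDK output by severity: DEBUG/INFO to stdout, WARNING+ to stderr.
sdk_logger = logging.getLogger("durable_workflow")
sdk_logger.setLevel(logging.DEBUG)

stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setLevel(logging.DEBUG)
# Callable filters are supported since Python 3.2; drop WARNING and above here.
stdout_handler.addFilter(lambda record: record.levelno < logging.WARNING)

stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.WARNING)

sdk_logger.addHandler(stdout_handler)
sdk_logger.addHandler(stderr_handler)
```

This keeps retryable-error noise (WARNING) out of the normal output stream without losing it.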
### Replay-Aware Logging

Inside workflows, use `ctx.logger` for replay-aware logging:

```python
@workflow.defn(name="order_processor")
class OrderProcessor:
    def run(self, ctx, order_id: str):
        ctx.logger.info("Processing order %s", order_id)  # Only logs during execution, not replay
        result = yield ctx.schedule_activity("process_order", [order_id])
        ctx.logger.info("Order processed: %s", result)
        return result
```

Log statements are **silent during replay** to avoid duplicate log spam when workflows recover or continue execution.

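To make the replay-suppression idea concrete, here is a sketch of how such a logger can be built on a standard `LoggerAdapter`. This is not the SDK's implementation: the `ctx.is_replaying` attribute is a hypothetical flag used purely for illustration.

```python
import logging

class ReplayAwareAdapter(logging.LoggerAdapter):
    """Suppress log output while the workflow context reports replay.

    `ctx.is_replaying` is a hypothetical attribute used for illustration;
    the real SDK may track replay state differently.
    """

    def __init__(self, logger, ctx):
        super().__init__(logger, {})
        self._ctx = ctx

    def isEnabledFor(self, level):
        # All LoggerAdapter methods (info, warning, ...) consult this check.
        if getattr(self._ctx, "is_replaying", False):
            return False
        return super().isEnabledFor(level)
```

During replay the adapter reports every level as disabled, so `info()` and `warning()` calls become no-ops; once live execution resumes, logging behaves normally.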
### Structured Logging

For JSON-structured logs, attach a handler with a JSON formatter to the `durable_workflow` logger:

```python
import json
import logging

class JSONFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JSONFormatter())
logging.getLogger("durable_workflow").addHandler(handler)
logging.getLogger("durable_workflow").setLevel(logging.INFO)
```

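To sanity-check the JSON output end to end, you can format a synthetic record and parse it back. The snippet repeats the formatter class so it runs standalone; the record contents are made up for the example.

```python
import json
import logging

class JSONFormatter(logging.Formatter):
    """Same formatter as above, repeated so this snippet runs on its own."""
    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

# Build a record by hand and run it through the formatter.
record = logging.LogRecord(
    name="durable_workflow.worker",
    level=logging.INFO,
    pathname="worker.py",
    lineno=0,
    msg="worker registered: %s",
    args=("worker-1",),
    exc_info=None,
)
line = JSONFormatter().format(record)
parsed = json.loads(line)
```

Each log event becomes a single JSON line, which log aggregators can ingest directly.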
## Schedules

Create and manage scheduled workflows through the client: