Summary
The Braintrust LiteLLM integration instruments `litellm.moderation()` (sync) but not `litellm.amoderation()` (async). This is the only LiteLLM function where the async variant is missing; every other instrumented function has both sync and async patchers.
LiteLLM exposes `amoderation()` as a documented async function (confirmed via the LiteLLM source and a recent bug-fix PR). Calls to `litellm.amoderation()` produce zero Braintrust tracing.
What is missing
No tracing span is created when users call `litellm.amoderation()` through either `wrap_litellm()` or `patch_litellm()`.
Current async parity in `py/src/braintrust/integrations/litellm/patchers.py`:
| Function | Sync Patcher | Async Patcher |
| --- | --- | --- |
| completion / acompletion | Yes | Yes |
| responses / aresponses | Yes | Yes |
| image_generation / aimage_generation | Yes | Yes |
| embedding / aembedding | Yes | Yes |
| speech / aspeech | Yes | Yes |
| transcription / atranscription | Yes | Yes |
| rerank / arerank | Yes | Yes |
| moderation / amoderation | Yes | No |
The fix requires:
- Adding `_amoderation_wrapper_async` to `tracing.py` (mirroring the existing `_moderation_wrapper`)
- Adding `LiteLLMAmoderationPatcher` to `patchers.py`
- Adding it to `_ALL_LITELLM_PATCHERS`
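The symbol names above come from the repository, but the internal span API is not shown in this report, so the following is only a generic sketch of the async-wrapper pattern the fix would follow. The `_Span` helper, the stub `amoderation`, and the `RECORDED_SPANS` list are stand-ins, not Braintrust APIs:

```python
import asyncio
import functools

# Stand-in for Braintrust's span machinery (hypothetical).
RECORDED_SPANS = []

class _Span:
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        RECORDED_SPANS.append(self.name)

def _amoderation_wrapper_async(fn):
    """Mirror of the sync wrapper, but awaiting the underlying call."""
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        with _Span("litellm.amoderation"):
            return await fn(*args, **kwargs)
    return wrapper

# Stub standing in for litellm.amoderation().
async def amoderation(input):
    return {"flagged": False}

patched_amoderation = _amoderation_wrapper_async(amoderation)
result = asyncio.run(patched_amoderation(input="hello"))
print(RECORDED_SPANS)  # ['litellm.amoderation']
```

The real implementation would wire this wrapper into a `LiteLLMAmoderationPatcher` the same way the existing async patchers (e.g. for `aembedding`) wire theirs.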
Braintrust docs status
The Braintrust LiteLLM integration docs do not specifically mention moderation support.
Upstream sources
- `amoderation()` confirmed via: "Fix: Moderations endpoint now respects `api_base` configuration parameter" (BerriAI/litellm#16087)
Local files inspected
- `py/src/braintrust/integrations/litellm/patchers.py` — `LiteLLMModerationPatcher` exists (sync only); no `LiteLLMAmoderationPatcher`
- `py/src/braintrust/integrations/litellm/tracing.py` — `_moderation_wrapper` exists; no `_amoderation_wrapper_async`
- `py/src/braintrust/integrations/litellm/test_litellm.py` — no async moderation tests
- Grep for `amoderation` across `py/src/braintrust/integrations/litellm/` returns zero matches
Relationship to existing issues
- "patch_litellm() does not patch embedding or moderation; aembedding missing entirely" (#115, closed) originally noted `amoderation` as missing alongside `aembedding`. The `aembedding` gap was fixed (a patcher now exists), but `amoderation` was not added during the integration refactor from `py/src/braintrust/wrappers/` to `py/src/braintrust/integrations/`.
- "text_completion() / atext_completion() not instrumented" (#401) tracks the separate `text_completion` / `atext_completion` gap.
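Since `test_litellm.py` has no async moderation coverage, a parity regression test could prevent this class of gap from recurring. A self-contained sketch, assuming the convention that each async variant is the sync name prefixed with `a` (the `PATCHED_FUNCS` set below is a stand-in for whatever `_ALL_LITELLM_PATCHERS` actually covers):

```python
# Stand-in for the function names the integration's patchers cover
# today; "amoderation" is deliberately absent, matching this report.
PATCHED_FUNCS = {
    "completion", "acompletion", "responses", "aresponses",
    "image_generation", "aimage_generation", "embedding", "aembedding",
    "speech", "aspeech", "transcription", "atranscription",
    "rerank", "arerank", "moderation",
}

def missing_async_variants(patched):
    """Return sorted async variants ('a' + sync name) lacking a patcher."""
    # A name is an async form if stripping its leading "a" yields a
    # patched sync name; everything else is treated as sync.
    async_forms = {f for f in patched if f.startswith("a") and f[1:] in patched}
    sync_forms = patched - async_forms
    return sorted("a" + f for f in sync_forms if "a" + f not in patched)

missing = missing_async_variants(PATCHED_FUNCS)
print(missing)  # ['amoderation']
```

Once `LiteLLMAmoderationPatcher` lands, a test asserting `missing_async_variants(...) == []` against the real patcher list would keep sync/async parity enforced going forward.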