Commit e9113ba

fix(models): pass NOT_GIVEN instead of None for system instruction in AnthropicLlm

When no system instruction is set (e.g. during event compaction), system_instruction is None. The Anthropic API rejects None: it expects a str or a list of content blocks, so a 400 Bad Request is raised when compaction fires via LlmEventSummarizer.

Pass NOT_GIVEN (already imported) when system_instruction is None or empty, so the parameter is omitted from the API call. Fixes both the streaming and non-streaming code paths.

Fixes #5318
Parent: 454188d

1 file changed: 13 additions & 2 deletions
src/google/adk/models/anthropic_llm.py

```diff
@@ -402,10 +402,16 @@ async def generate_content_async(
         else NOT_GIVEN
     )
 
+    system = (
+        llm_request.config.system_instruction
+        if llm_request.config and llm_request.config.system_instruction
+        else NOT_GIVEN
+    )
+
     if not stream:
       message = await self._anthropic_client.messages.create(
           model=model_to_use,
-          system=llm_request.config.system_instruction,
+          system=system,
           messages=messages,
           tools=tools,
           tool_choice=tool_choice,
@@ -431,9 +437,14 @@ async def _generate_content_streaming(
       a final aggregated LlmResponse with all content.
     """
     model_to_use = self._resolve_model_name(llm_request.model)
+    system = (
+        llm_request.config.system_instruction
+        if llm_request.config and llm_request.config.system_instruction
+        else NOT_GIVEN
+    )
     raw_stream = await self._anthropic_client.messages.create(
         model=model_to_use,
-        system=llm_request.config.system_instruction,
+        system=system,
         messages=messages,
         tools=tools,
         tool_choice=tool_choice,
```
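The guard added at both call sites follows the SDK's omit-if-unset sentinel convention: NOT_GIVEN means "leave this parameter out of the request entirely", which is different from passing None. A minimal self-contained sketch of the pattern follows; the `NotGiven` class and `create()` function below are stand-ins for illustration, not the real `anthropic` SDK:

```python
class NotGiven:
    """Stand-in for anthropic's NOT_GIVEN sentinel: 'omit this parameter'."""

    def __repr__(self):
        return "NOT_GIVEN"


NOT_GIVEN = NotGiven()


def create(model, system=NOT_GIVEN, **kwargs):
    """Stand-in for messages.create(): rejects None like the real API,
    but silently drops any parameter left as NOT_GIVEN."""
    if system is None:
        raise ValueError(
            "400 Bad Request: system must be a str or a list of content blocks"
        )
    payload = {"model": model, **kwargs}
    if not isinstance(system, NotGiven):
        payload["system"] = system
    return payload


# Before the fix: a missing instruction is forwarded as None and the
# call fails. (Uncommenting the next line raises ValueError.)
# create("claude-sonnet", system=None)

# After the fix: map a None/empty instruction to NOT_GIVEN, so the
# key is omitted from the request instead of being sent as None.
system_instruction = None  # e.g. during event compaction
system = system_instruction if system_instruction else NOT_GIVEN
payload = create("claude-sonnet", system=system)
assert "system" not in payload
```

The same truthiness check also covers an empty-string instruction, which is why the commit message says "None/empty" rather than just None.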
