Commit ef2b0fb

claude authored and saurabhbikram committed
fix: truncate excess scenarios instead of raising error

When the LLM generates more scenarios than requested, truncate to the requested count instead of raising a ValueError. This makes scenario generation more robust, since LLMs occasionally overshoot the count. Still raises an error when too few scenarios are generated.

https://claude.ai/code/session_017Y9KNNQX2RyVWnqpj3A4hh
1 parent e57ab66 commit ef2b0fb

File tree: 1 file changed (+6 −1 lines changed)

src/art/mcp/generate_scenarios.py — 6 additions, 1 deletion

```diff
@@ -201,9 +201,14 @@ async def generate_scenarios(
     scenarios = result if isinstance(result, list) else list(result.values())[0]

     # Validate count
-    if len(scenarios) != num_scenarios:
+    if len(scenarios) < num_scenarios:
         err(f"Expected {num_scenarios} scenarios, got {len(scenarios)}.")
         raise ValueError(f"Expected {num_scenarios} scenarios, got {len(scenarios)}")
+    elif len(scenarios) > num_scenarios:
+        ok(
+            f"Expected {num_scenarios} scenarios, got {len(scenarios)}. Truncating to {num_scenarios}."
+        )
+        scenarios = scenarios[:num_scenarios]

     ok(f"Parsed {len(scenarios)} scenario(s) successfully.")
```
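The post-change validation logic can be sketched in isolation. This is a minimal sketch, not the module's actual code: the helper name `validate_scenario_count` is hypothetical, and the real function logs via the module's `err`/`ok` helpers (omitted here) before raising or truncating.

```python
def validate_scenario_count(scenarios: list, num_scenarios: int) -> list:
    """Return exactly num_scenarios items, or raise if too few were generated."""
    if len(scenarios) < num_scenarios:
        # Too few scenarios is unrecoverable: fail loudly.
        raise ValueError(f"Expected {num_scenarios} scenarios, got {len(scenarios)}")
    # Too many scenarios is tolerated: keep only the requested count.
    # (For an exact match, the slice is a no-op.)
    return scenarios[:num_scenarios]
```

The asymmetry is deliberate: an overshoot still contains everything the caller asked for, so discarding extras is safe, while an undershoot cannot be repaired after the fact.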
