
Fix max tokens tool calls function #1023

Open

carlows wants to merge 3 commits into patterns-ai-core:main from carlows:fix/max_tokens

Conversation


@carlows carlows commented Aug 21, 2025

When you use the max_tokens option with Google Gemini and the limit is reached, the response looks like this:

{
  "candidates": [
    {
      "content": {
        "role": "model"
      },
      "finishReason": "MAX_TOKENS",
      "index": 0
    }
  ],
  "usageMetadata": {
    ...
  },
  "modelVersion": "gemini-2.5-flash-lite",
  "responseId": "123"
}

Note that the truncated candidate's content contains only the role, with no parts. The .tool_calls function then fails because it looks up the parts and calls has_key? on the result, but nil doesn't have a has_key? method.
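A minimal sketch of the failure mode and a nil-safe guard, assuming the parsed response is a plain Ruby hash mirroring the JSON above (the method and helper names here are illustrative, not the gem's actual API):

```ruby
# Hypothetical nil-safe extraction of tool calls from a Gemini response hash.
def tool_calls(raw_response)
  # On a MAX_TOKENS response, "content" has no "parts" key, so dig returns nil.
  parts = raw_response.dig("candidates", 0, "content", "parts")
  return [] if parts.nil?

  # Without the guard above, calling has_key? on nil raises NoMethodError.
  parts.select { |part| part.has_key?("functionCall") }
end

truncated = {
  "candidates" => [
    { "content" => { "role" => "model" }, "finishReason" => "MAX_TOKENS" }
  ]
}

tool_calls(truncated) # => [] instead of NoMethodError
```

Returning an empty array for the truncated case lets callers treat "no tool calls" uniformly, rather than rescuing an exception.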

