Basic checks
What's broken?
with_params(response_format: { type: 'json_object' }) does not work for Gemini models.
code:072> RubyLLM.chat(model: 'gemini-2.5-flash-lite').with_params(response_format: { type: 'json_object' }).ask('Testing. Respond with a JSON object with success = true')
code:72:in '<main>': Invalid JSON payload received. Unknown name "response_format": Cannot find field. (RubyLLM::BadRequestError)
However, setting the response MIME type in the generation configuration works:
https://ai.google.dev/gemini-api/docs/structured-output?example=recipe#json_schema_support
chat:075> RubyLLM.chat(model: 'gemini-2.5-flash-lite').with_params(generationConfig: { responseMimeType: 'application/json' }).ask('Testing. Respond with a valid JSON with success = true')
=>
#<RubyLLM::Message:0x00000001299f3628
@content=#<RubyLLM::Content:0x0000000128daa420 @attachments=[], @text="{\n \"success\": true\n}">,
@model_id="gemini-2.5-flash-lite",
@role=:assistant,
@thinking=nil,
@tokens=#<RubyLLM::Tokens:0x00000001299f3538 @cache_creation=nil, @cached=nil, @input=12, @output=9, @thinking=nil>,
@tool_call_id=nil,
@tool_calls=nil>
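The linked docs also describe constraining the output with a response schema. A hypothetical sketch of the provider-specific payload, assuming the camelCase field names from the Gemini REST API (passing this hash via with_params mirrors the working call above and is not a documented RubyLLM feature):

```ruby
# Provider-specific generation config for Gemini structured output.
# Field names (responseMimeType, responseSchema) follow the Gemini REST API.
generation_config = {
  generationConfig: {
    responseMimeType: 'application/json',
    responseSchema: {
      type: 'OBJECT',
      properties: { success: { type: 'BOOLEAN' } },
      required: ['success']
    }
  }
}

# Assumed usage, analogous to the working call above:
# RubyLLM.chat(model: 'gemini-2.5-flash-lite')
#        .with_params(**generation_config)
#        .ask('Respond with a JSON object with success = true')
```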
How to reproduce
RubyLLM.chat(model: 'gemini-2.5-flash-lite').with_params(response_format: { type: 'json_object' }).ask('Testing. Respond with a JSON object with success = true')
- Raises RubyLLM::BadRequestError: Invalid JSON payload received. Unknown name "response_format": Cannot find field.
Expected behavior
If the Gemini provider sees response_format: { type: 'json_object' }, it should either:
- Translate it into the provider-specific configuration, OR
- Raise an explicit error OR
- Clearly document in the RubyLLM docs that Gemini needs
generationConfig: { responseMimeType: ..., responseSchema: ... }, not response_format.
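A minimal sketch of the first option: translating the OpenAI-style parameter into Gemini's generationConfig before building the request. The method name and placement are hypothetical, not actual RubyLLM internals:

```ruby
# Hypothetical translation layer for the Gemini provider: maps the
# OpenAI-style response_format param to Gemini's generationConfig.
def translate_params(params)
  params = params.dup
  format = params.delete(:response_format)
  return params unless format

  if format[:type] == 'json_object'
    config = params[:generationConfig] ||= {}
    config[:responseMimeType] = 'application/json'
  end
  params
end

translate_params(response_format: { type: 'json_object' })
# => { generationConfig: { responseMimeType: 'application/json' } }
```

Parameters without response_format would pass through untouched, so existing generationConfig callers keep working.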
What actually happened
The response_format parameter is passed through to the Gemini API verbatim, which rejects the unknown field with RubyLLM::BadRequestError, so OpenAI-style JSON mode is unusable with Gemini models.
Environment
- Ruby version: (any)
- RubyLLM version: (latest)
- Provider: gemini
- OS: (any)