
Commit f297288

docs: add YAML example to OpenAIChatGenerator (#11146)
Co-authored-by: Julian Risch <julian.risch@deepset.ai>
1 parent 5bd14f3 commit f297288

1 file changed: 38 additions & 0 deletions

docs-website/docs/pipeline-components/generators/openaichatgenerator.mdx
````diff
@@ -229,6 +229,44 @@ pipe.run(data={"prompt_builder": {"template_variables":{"location": location}, "
 >> 'cached_tokens': 0}}})]}}
 ```
 
+### In YAML
+
+This is the YAML representation of the pipeline shown above. It dynamically constructs a prompt and generates an answer using a chat model.
+
+```yaml
+components:
+  llm:
+    init_parameters:
+      api_base_url: null
+      api_key:
+        env_vars:
+        - OPENAI_API_KEY
+        strict: true
+        type: env_var
+      generation_kwargs: {}
+      http_client_kwargs: null
+      max_retries: null
+      model: gpt-4o-mini
+      organization: null
+      streaming_callback: null
+      timeout: null
+      tools: null
+      tools_strict: false
+    type: haystack.components.generators.chat.openai.OpenAIChatGenerator
+  prompt_builder:
+    init_parameters:
+      required_variables: null
+      template: null
+      variables: null
+    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
+connection_type_validation: true
+connections:
+- receiver: llm.messages
+  sender: prompt_builder.prompt
+max_runs_per_component: 100
+metadata: {}
+```
+
 ## Additional References
 
 :notebook: Tutorial: [Building a Chat Application with Function Calling](https://haystack.deepset.ai/tutorials/40_building_chat_application_with_function_calling)
````
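The added YAML is just the serialized form of a plain dictionary: top-level `components` keyed by name, each with a `type` import path and `init_parameters`, plus a `connections` list wiring output sockets to input sockets. A minimal, purely illustrative sketch of that structure in Python (a trimmed subset of the YAML above; omitted keys would keep their defaults):

```python
# Illustrative sketch of the pipeline-definition structure serialized above.
# This mirrors the YAML; it is not an official schema.
pipeline_def = {
    "components": {
        "llm": {
            "type": "haystack.components.generators.chat.openai.OpenAIChatGenerator",
            "init_parameters": {
                "model": "gpt-4o-mini",
                # The secret is serialized as a reference to an environment
                # variable, never as the key value itself.
                "api_key": {
                    "type": "env_var",
                    "env_vars": ["OPENAI_API_KEY"],
                    "strict": True,
                },
            },
        },
        "prompt_builder": {
            "type": "haystack.components.builders.chat_prompt_builder.ChatPromptBuilder",
            "init_parameters": {},
        },
    },
    # Each connection wires an output socket "component.socket"
    # to an input socket on another component.
    "connections": [
        {"sender": "prompt_builder.prompt", "receiver": "llm.messages"},
    ],
    "max_runs_per_component": 100,
}

# Summarize the wiring as "sender -> receiver" strings.
wiring = [f'{c["sender"]} -> {c["receiver"]}' for c in pipeline_def["connections"]]
print(wiring)
```

With `haystack-ai` installed, the equivalent YAML string can be turned back into a runnable pipeline via `Pipeline.loads(...)` (not executed here to keep the sketch dependency-free).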
