diff --git a/docs-website/docs/pipeline-components/builders/promptbuilder.mdx b/docs-website/docs/pipeline-components/builders/promptbuilder.mdx
index dabb00396e..c71ecd4f35 100644
--- a/docs-website/docs/pipeline-components/builders/promptbuilder.mdx
+++ b/docs-website/docs/pipeline-components/builders/promptbuilder.mdx
@@ -257,6 +257,45 @@ p.run(
 
 Note that `language_template` introduces the `answer_language` variable, which is not bound to any pipeline variable. If not set otherwise, it falls back to its default value, "English". In this example, we override it with "German". `template_variables` also lets you override pipeline variables (such as `documents`).
 
+### In YAML
+
+This is the YAML representation of the RAG pipeline shown above. The pipeline renders a custom prompt template by filling it with the contents of retrieved documents and a query, then sends the rendered prompt to a generator.
+
+```yaml
+components:
+  llm:
+    init_parameters:
+      api_base_url: null
+      api_key:
+        env_vars:
+        - OPENAI_API_KEY
+        strict: true
+        type: env_var
+      generation_kwargs: {}
+      http_client_kwargs: null
+      max_retries: null
+      model: gpt-5-mini
+      organization: null
+      streaming_callback: null
+      system_prompt: null
+      timeout: null
+    type: haystack.components.generators.openai.OpenAIGenerator
+  prompt_builder:
+    init_parameters:
+      required_variables: null
+      template: "\n    Given these documents, answer the question.\\nDocuments:\n\
+        \    {% for doc in documents %}\n    {{ doc.content }}\n    {% endfor\
+        \ %}\n\n    \\nQuestion: {{query}}\n    \\nAnswer:\n    "
+      variables: null
+    type: haystack.components.builders.prompt_builder.PromptBuilder
+connection_type_validation: true
+connections:
+- receiver: llm.prompt
+  sender: prompt_builder.prompt
+max_runs_per_component: 100
+metadata: {}
+```
+
 ## Additional References
 
 🧑‍🍳 Cookbooks:
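
To make the YAML above easier to follow, here is a minimal, stdlib-only sketch of what the `prompt_builder` template evaluates to once `documents` and `query` are filled in. This is purely illustrative: Haystack's `PromptBuilder` actually renders the template with Jinja2, and the `render_prompt` helper below is a hypothetical stand-in that just mimics the `{% for %}` loop and variable substitution.

```python
# Illustrative sketch only: a plain-Python approximation of the prompt that
# the Jinja template in the YAML above would produce. `render_prompt` is a
# hypothetical helper, not part of Haystack's API.

def render_prompt(documents: list[str], query: str) -> str:
    parts = ["Given these documents, answer the question.\nDocuments:"]
    for content in documents:          # mirrors {% for doc in documents %}
        parts.append(content)          # mirrors {{ doc.content }}
    parts.append(f"\nQuestion: {query}")  # mirrors {{query}}
    parts.append("\nAnswer:")
    return "\n".join(parts)

prompt = render_prompt(
    documents=["Paris is the capital of France."],
    query="What is the capital of France?",
)
print(prompt)
```

A pipeline serialized in this YAML form can typically be reconstructed in Haystack 2.x with `Pipeline.loads(yaml_string)` (or `Pipeline.load(file)`), which instantiates the `components` and wires up the `connections` listed in the document.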