Commit 1a30641

docs: add YAML example to PromptBuilder documentation

1 parent 428bcab

1 file changed

Lines changed: 78 additions & 0 deletions

docs-website/docs/pipeline-components/builders/promptbuilder.mdx
@@ -188,6 +188,45 @@ result = p.run({"prompt_builder": {"documents": documents, "query": question}})
print(result)
```

#### In YAML

This is the YAML representation of the RAG pipeline shown above. It renders a custom prompt template by filling it with the contents of retrieved documents and a query, then sends the rendered prompt to a generator.

```yaml
components:
  llm:
    init_parameters:
      api_base_url: null
      api_key:
        env_vars:
        - OPENAI_API_KEY
        strict: true
        type: env_var
      generation_kwargs: {}
      http_client_kwargs: null
      max_retries: null
      model: gpt-5-mini
      organization: null
      streaming_callback: null
      system_prompt: null
      timeout: null
    type: haystack.components.generators.openai.OpenAIGenerator
  prompt_builder:
    init_parameters:
      required_variables: null
      template: "\n Given these documents, answer the question.\nDocuments:\n\
        \ {% for doc in documents %}\n {{ doc.content }}\n {% endfor\
        \ %}\n\n \nQuestion: {{query}}\n \nAnswer:\n "
      variables: null
    type: haystack.components.builders.prompt_builder.PromptBuilder
connection_type_validation: true
connections:
- receiver: llm.prompt
  sender: prompt_builder.prompt
max_runs_per_component: 100
metadata: {}
```
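To make the `template` string in the YAML above concrete, here is a minimal sketch of what rendering it produces. This is *not* Haystack's implementation (PromptBuilder renders the template with Jinja2); the loop and variable substitution are simulated by hand in plain Python, and the helper name `render_prompt` is invented for illustration.

```python
# Hypothetical stand-in for PromptBuilder.run(documents=..., query=...).
# The real component renders the Jinja template from the YAML config;
# here the {% for %} loop and {{ }} substitutions are done manually.

def render_prompt(documents: list[str], query: str) -> str:
    # {% for doc in documents %} {{ doc.content }} {% endfor %}
    doc_section = "\n".join(documents)
    return (
        "Given these documents, answer the question.\n"
        f"Documents:\n{doc_section}\n\n"
        f"Question: {query}\n"
        "Answer:"
    )

prompt = render_prompt(
    ["Paris is the capital of France."],
    "What is the capital of France?",
)
print(prompt)
```

The string returned here is what flows over the `prompt_builder.prompt` → `llm.prompt` connection declared in the YAML.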
#### Changing the template at runtime (Prompt Engineering)
`PromptBuilder` lets you switch the prompt template of an existing pipeline. The example below builds on the pipeline from the previous section, invoking it with a new prompt template:
@@ -257,6 +296,45 @@ p.run(
Note that `language_template` introduces the `answer_language` variable, which is not bound to any pipeline variable. If not set otherwise, it uses its default value, "English". In this example, we overwrite its value with "German".
`template_variables` allows you to overwrite pipeline variables (such as `documents`) as well.

#### In YAML

This is the YAML representation of the RAG pipeline shown above. It renders a custom prompt template by filling it with the contents of retrieved documents and a query, then sends the rendered prompt to a generator.

```yaml
components:
  llm:
    init_parameters:
      api_base_url: null
      api_key:
        env_vars:
        - OPENAI_API_KEY
        strict: true
        type: env_var
      generation_kwargs: {}
      http_client_kwargs: null
      max_retries: null
      model: gpt-5-mini
      organization: null
      streaming_callback: null
      system_prompt: null
      timeout: null
    type: haystack.components.generators.openai.OpenAIGenerator
  prompt_builder:
    init_parameters:
      required_variables: null
      template: "\n Given these documents, answer the question.\nDocuments:\n\
        \ {% for doc in documents %}\n {{ doc.content }}\n {% endfor\
        \ %}\n\n \nQuestion: {{query}}\n \nAnswer:\n "
      variables: null
    type: haystack.components.builders.prompt_builder.PromptBuilder
connection_type_validation: true
connections:
- receiver: llm.prompt
  sender: prompt_builder.prompt
max_runs_per_component: 100
metadata: {}
```
## Additional References

🧑‍🍳 Cookbooks:
