
Commit c3f682c

docs: sync Haystack API reference on Docusaurus (#11153)
Co-authored-by: julian-risch <4181769+julian-risch@users.noreply.github.com>
1 parent 0dd8806 commit c3f682c

1 file changed: 11 additions & 5 deletions

docs-website/reference/haystack-api/generators_api.md
@@ -1221,8 +1221,8 @@ __init__(
     *,
     chat_generator: ChatGenerator,
     system_prompt: str | None = None,
-    user_prompt: str | None = None,
-    required_variables: list[str] | Literal["*"] | None = None,
+    user_prompt: str,
+    required_variables: list[str] | Literal["*"] = "*",
     streaming_callback: StreamingCallbackT | None = None
 ) -> None
 ```
@@ -1233,12 +1233,18 @@ Initialize the LLM component.
 
 - **chat_generator** (<code>ChatGenerator</code>) – An instance of the chat generator that the LLM should use.
 - **system_prompt** (<code>str | None</code>) – System prompt for the LLM.
-- **user_prompt** (<code>str | None</code>) – User prompt for the LLM. If provided this is appended to the messages provided at runtime.
-- **required_variables** (<code>list\[str\] | Literal['\*'] | None</code>) – List variables that must be provided as input to user_prompt.
+- **user_prompt** (<code>str</code>) – User prompt for the LLM. Must contain at least one Jinja2 template variable
+  (e.g., `{{ variable_name }}`). This prompt is appended to the messages provided at runtime.
+- **required_variables** (<code>list\[str\] | Literal['\*']</code>) – Variables that must be provided as input to user_prompt.
   If a variable listed as required is not provided, an exception is raised.
-  If set to `"*"`, all variables found in the prompt are required. Optional.
+  If set to `"*"`, all variables found in the prompt are required. Defaults to `"*"`.
 - **streaming_callback** (<code>StreamingCallbackT | None</code>) – A callback that will be invoked when a response is streamed from the LLM.
 
+**Raises:**
+
+- <code>ValueError</code> – If user_prompt contains no template variables.
+- <code>ValueError</code> – If required_variables is an empty list.
+
 #### to_dict
 
 ```python
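The validation rules this diff documents (user_prompt must contain at least one Jinja2 template variable; with `required_variables="*"` every variable found in the prompt is required; an empty required_variables list is rejected) can be sketched with Jinja2's own template introspection. The helper below is illustrative only, not Haystack's actual implementation; only the parameter names come from the diff:

```python
from jinja2 import Environment, meta


def validate_user_prompt(user_prompt: str, required_variables="*") -> set[str]:
    """Illustrative check mirroring the documented rules.

    Returns the set of variables treated as required, or raises
    ValueError as described under **Raises:** in the diff above.
    """
    env = Environment()
    # Collect all undeclared (i.e., template-level) variables in the prompt.
    found = meta.find_undeclared_variables(env.parse(user_prompt))
    if not found:
        raise ValueError("user_prompt contains no template variables")
    if required_variables == "*":
        # "*" means every variable found in the prompt is required.
        return found
    if not required_variables:
        raise ValueError("required_variables is an empty list")
    return set(required_variables)


print(validate_user_prompt("Summarize this: {{ document }}"))
```

Calling it with a prompt that has no `{{ ... }}` placeholders raises `ValueError`, matching the behavior the new docstring describes.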
