diff --git a/docs-website/reference/haystack-api/generators_api.md b/docs-website/reference/haystack-api/generators_api.md
index 48a137a450..651a692d4d 100644
--- a/docs-website/reference/haystack-api/generators_api.md
+++ b/docs-website/reference/haystack-api/generators_api.md
@@ -1221,8 +1221,8 @@ __init__(
*,
chat_generator: ChatGenerator,
system_prompt: str | None = None,
- user_prompt: str | None = None,
- required_variables: list[str] | Literal["*"] | None = None,
+ user_prompt: str,
+ required_variables: list[str] | Literal["*"] = "*",
streaming_callback: StreamingCallbackT | None = None
) -> None
```
@@ -1233,12 +1233,19 @@ Initialize the LLM component.
- **chat_generator** (ChatGenerator) – An instance of the chat generator that the LLM should use.
- **system_prompt** (str | None) – System prompt for the LLM.
-- **user_prompt** (str | None) – User prompt for the LLM. If provided this is appended to the messages provided at runtime.
-- **required_variables** (list\[str\] | Literal['\*'] | None) – List variables that must be provided as input to user_prompt.
+- **user_prompt** (str) – User prompt for the LLM. Must contain at least one Jinja2 template variable
+ (e.g., `{{ variable_name }}`). This prompt is appended to the messages provided at runtime.
+- **required_variables** (list\[str\] | Literal['\*']) – Variables that must be provided as input to user_prompt.
 If a variable listed as required is not provided, an exception is raised.
- If set to `"*"`, all variables found in the prompt are required. Optional.
+ If set to `"*"`, all variables found in the prompt are required. Defaults to `"*"`.
- **streaming_callback** (StreamingCallbackT | None) – A callback that will be invoked when a response is streamed from the LLM.
+
+**Raises:**
+
+- ValueError – If user_prompt contains no template variables.
+- ValueError – If required_variables is an empty list.
+
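The two `ValueError` cases above (a `user_prompt` without any Jinja2 template variable, and an empty `required_variables` list) can be illustrated with a standalone sketch. `validate_user_prompt` is a hypothetical helper written for this example, not part of Haystack's API, and its variable-matching regex only covers simple `{{ name }}` placeholders:

```python
import re

def validate_user_prompt(user_prompt: str, required_variables="*"):
    """Hypothetical helper mirroring the documented rules; not Haystack code."""
    # Collect simple Jinja2-style variables such as {{ variable_name }}.
    variables = re.findall(r"\{\{\s*(\w+)\s*\}\}", user_prompt)
    if not variables:
        # Mirrors: ValueError if user_prompt contains no template variables.
        raise ValueError("user_prompt must contain at least one Jinja2 template variable")
    if required_variables == []:
        # Mirrors: ValueError if required_variables is an empty list.
        raise ValueError("required_variables must not be an empty list")
    # With the default "*", every variable found in the prompt is required.
    return variables if required_variables == "*" else list(required_variables)

print(validate_user_prompt("Summarize: {{ document }}"))  # ['document']
```

With the default `required_variables="*"`, every placeholder found in the prompt must be supplied at runtime; passing an explicit list narrows that requirement to the named variables only.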
#### to_dict
```python