Commit ce62d2d

Document custom LLM context messages for tools
Added details on custom LLM context messages for tools, including how to override the default message with specific information about the generated embed.
1 parent f8a47ed commit ce62d2d

File tree

1 file changed: +24 −0 lines
  • docs/features/extensibility/plugin/development


docs/features/extensibility/plugin/development/rich-ui.mdx

Lines changed: 24 additions & 0 deletions
@@ -46,6 +46,30 @@ def create_visualization_tool(self, data: str) -> HTMLResponse:
    return HTMLResponse(content=html_content, headers=headers)
```
### Custom LLM Context Messages (Tools)

By default, when a Tool successfully returns an `HTMLResponse` embed, the LLM is sent a generic system message: `"{tool_function_name}: Embedded UI result is active and visible to the user."`

While this prevents the LLM from trying to read raw HTML, it also hides the context of what actually happened. If you want the LLM to know specific details about the generated embed (such as the parameters used, the status of the action, or what the user is seeing), you can override this generic message using the optional `Tool-Result-Message` header.

```python
from fastapi.responses import HTMLResponse

def generate_music_tool(self, genre: str, duration: int) -> HTMLResponse:
    """
    Generates music and embeds a player in the chat.
    """
    # Your HTML player generation logic
    html_content = "<html><body><audio controls src='...'></audio></body></html>"

    headers = {
        "Content-Disposition": "inline",
        # This message is sent to the LLM instead of the generic default
        "Tool-Result-Message": f"Audio player embedded successfully. A {duration}-second {genre} track was generated. The user can play or download it from the player above.",
    }

    return HTMLResponse(content=html_content, headers=headers)
```

This ensures the LLM receives actionable, semantic feedback about the visualization or embedded UI, allowing it to provide relevant follow-up responses to the user.
## Action Usage

Actions work exactly the same way. The rich UI embed is delivered to the chat via the event emitter:
