docs-website/docs/pipeline-components/agents-1/agent.mdx (37 additions, 1 deletion)
@@ -54,6 +54,40 @@ You can additionally configure:
For a complete list of available parameters, refer to the [Agents API Documentation](/reference/agents-api).
:::
### Agents as Tools
You can wrap an `Agent` using [`ComponentTool`](../../tools/componenttool.mdx) to create multi-agent systems where specialized agents act as tools for a coordinator agent.
When wrapping an `Agent` as a `ComponentTool`, set the `outputs_to_string` parameter to `{"source": "last_message"}` to extract only the agent's final response text, rather than the full execution trace with its tool calls. This keeps the coordinator agent's context clean and focused.
```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import ComponentTool

# Wrap the agent as a ComponentTool with outputs_to_string
research_tool = ComponentTool(
    component=research_agent,  # another Agent component, defined earlier
    name="research_specialist",
    description="A specialist that can research topics from the knowledge base",
    outputs_to_string={"source": "last_message"},  # extract only the final response
)

# Create a coordinator agent that uses the specialist
coordinator_agent = Agent(
    chat_generator=OpenAIChatGenerator(),  # any chat generator works here
    tools=[research_tool],
    system_prompt="You are a coordinator that delegates research tasks to a specialist.",
    exit_conditions=["text"],
)

# Warm up and run
research_agent.warm_up()
coordinator_agent.warm_up()

result = coordinator_agent.run(
    messages=[ChatMessage.from_user("Tell me about Haystack")]
)

print(result["last_message"].text)
```
### Streaming
You can stream output as it’s generated. Pass a callback to `streaming_callback`. Use the built-in `print_streaming_chunk` to print text tokens and tool events (tool calls and tool results).
docs-website/versioned_docs/version-2.20-unstable/pipeline-components/agents-1/agent.mdx (37 additions, 1 deletion): the same change applied to the versioned docs copy.