
Commit 7a7011b

dfokina and bilgeyucel authored
docs: add a Multi-Agent with ComponentTool example (#10032)
* agent with componenttool example
* update "conversation history"
* shorten tool example
* move section & update other version

Co-authored-by: Bilge Yücel <bilgeyucel96@gmail.com>
1 parent 5aeec2a commit 7a7011b

2 files changed

Lines changed: 74 additions & 2 deletions

File tree

  • docs-website
    • docs/pipeline-components/agents-1
    • versioned_docs/version-2.20-unstable/pipeline-components/agents-1

docs-website/docs/pipeline-components/agents-1/agent.mdx

Lines changed: 37 additions & 1 deletion
@@ -54,6 +54,40 @@ You can additionally configure:

Context:

For a complete list of available parameters, refer to the [Agents API Documentation](/reference/agents-api).
:::

Added:

### Agents as Tools

You can wrap an `Agent` in a [`ComponentTool`](../../tools/componenttool.mdx) to create multi-agent systems where specialized agents act as tools for a coordinator agent.

When wrapping an `Agent` as a `ComponentTool`, set the `outputs_to_string` parameter to `{"source": "last_message"}` to extract only the agent's final response text rather than the full execution trace with tool calls. This keeps the coordinator agent's context clean and focused.

```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import ComponentTool

# Wrap the agent as a ComponentTool with outputs_to_string
research_tool = ComponentTool(
    component=research_agent,  # another agent component, defined elsewhere
    name="research_specialist",
    description="A specialist that can research topics from the knowledge base",
    outputs_to_string={"source": "last_message"},  # extract only the final response
)

# Create a coordinator agent that uses the specialist
coordinator_agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
    tools=[research_tool],
    system_prompt="You are a coordinator that delegates research tasks to a specialist.",
    exit_conditions=["text"],
)

# Warm up and run
research_agent.warm_up()
coordinator_agent.warm_up()

result = coordinator_agent.run(
    messages=[ChatMessage.from_user("Tell me about Haystack")]
)

print(result["last_message"].text)
```

Context:

### Streaming

You can stream output as it’s generated. Pass a callback to `streaming_callback`. Use the built-in `print_streaming_chunk` to print text tokens and tool events (tool calls and tool results).
@@ -197,4 +231,6 @@ print(agent_output["database_agent"]["messages"][-1].text)

Context:

🧑‍🍳 Cookbook: [Build a GitHub Issue Resolver Agent](https://haystack.deepset.ai/cookbook/github_issue_resolver_agent)

Removed:

📓 Tutorial: [Build a Tool-Calling Agent](https://haystack.deepset.ai/tutorials/43_building_a_tool_calling_agent)

Added:

📓 Tutorials:
- [Build a Tool-Calling Agent](https://haystack.deepset.ai/tutorials/43_building_a_tool_calling_agent)
- [Creating a Multi-Agent System](https://haystack.deepset.ai/tutorials/45_creating_a_multi_agent_system)
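The core idea of the added section — a wrapped specialist agent whose internal tool-call trace is hidden from the coordinator, which sees only the last message — can be illustrated with a framework-free sketch. All class and function names here are hypothetical stand-ins, not the Haystack API; the `[-1]` indexing plays the role of `outputs_to_string={"source": "last_message"}`.

```python
# Framework-free sketch of the "agent as tool" pattern. The specialist
# produces a full execution trace; the wrapper exposes only the final
# message, so the coordinator's context stays clean.

class SpecialistAgent:
    """Produces an execution trace; only the last entry is the answer."""
    def run(self, query: str) -> list[str]:
        return [
            f"tool_call: search({query})",      # internal tool call
            "tool_result: 3 documents found",   # internal tool result
            f"Summary of {query}: ...",         # final response
        ]

class AgentAsTool:
    """Wraps an agent and extracts only its last message."""
    def __init__(self, agent: SpecialistAgent):
        self.agent = agent

    def __call__(self, query: str) -> str:
        # Analogous to outputs_to_string={"source": "last_message"}
        return self.agent.run(query)[-1]

class Coordinator:
    """Delegates to the tool; never sees the specialist's trace."""
    def __init__(self, tool: AgentAsTool):
        self.tool = tool

    def run(self, query: str) -> str:
        return f"Coordinator answer based on: {self.tool(query)}"

tool = AgentAsTool(SpecialistAgent())
print(Coordinator(tool).run("Haystack"))
# → Coordinator answer based on: Summary of Haystack: ...
```

Without the last-message extraction, the coordinator would receive the specialist's tool calls and intermediate results as well, inflating its context with details it does not need.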

docs-website/versioned_docs/version-2.20-unstable/pipeline-components/agents-1/agent.mdx

Lines changed: 37 additions & 1 deletion

The diff is identical to the one shown above for docs-website/docs/pipeline-components/agents-1/agent.mdx.

Comments (0)