Commit 6bc41cb
docs: remove unnecessary warm_up() call from document embedder examples
1 parent 48ebdd6 commit 6bc41cb

1 file changed: +277 −1 lines changed

docs-website/versioned_docs/version-2.25/overview/migrating-from-langgraphlangchain-to-haystack.mdx
@@ -44,7 +44,7 @@ Here's a table of key concepts and their approximate equivalents between the two
| Model Context Protocol (`load_mcp_tools`, `MultiServerMCPClient`) | Model Context Protocol - `MCPTool`, `MCPToolset`, `StdioServerInfo`, `StreamableHttpServerInfo` | Haystack provides [various MCP primitives](https://haystack.deepset.ai/integrations/mcp) for connecting multiple MCP servers and organizing MCP toolsets. |
| Memory (State, short-term, long-term) | Memory (Agent State, short-term, long-term) | Agent [State](../concepts/agents/state.mdx) provides a structured way to share data between tools and store intermediate results during agent execution. For short-term memory, Haystack offers a [ChatMessage Store](/reference/experimental-chatmessage-store-api) to persist chat history. More memory options are coming soon. |
| Time travel (Checkpoints) | Breakpoints (Breakpoint, AgentBreakpoint, ToolBreakpoint, Snapshot) | [Breakpoints](../concepts/pipelines/pipeline-breakpoints.mdx) let you pause, inspect, modify, and resume a pipeline, agent, or tool for debugging or iterative development. |
-| Human-in-the-Loop (Interrupts / Commands) | Human-in-the-loop (ConfirmationStrategy / ConfirmationPolicy) | (Experimental) Haystack uses [confirmation strategies](https://haystack.deepset.ai/tutorials/47_human_in_the_loop_agent) to pause or block the execution to gather user feedback |
+| Human-in-the-Loop (Interrupts / Commands) | Human-in-the-loop (ConfirmationStrategy / ConfirmationPolicy) | Haystack uses [confirmation strategies](https://haystack.deepset.ai/tutorials/47_human_in_the_loop_agent) to pause or block the execution to gather user feedback. |

## Ecosystem and Tooling Mapping: LangChain → Haystack

@@ -373,6 +373,282 @@ for m in messages["messages"]:
</div>
</div>

### Creating Agents

The [Agentic Flows](#agentic-flows-with-haystack-vs-langgraph) walkthrough above showed how to assemble an agent loop manually from pipeline primitives. Haystack also provides a high-level `Agent` class that wraps the full loop - LLM calls, tool invocation, and iteration - into a single component. LangChain offers an equivalent shortcut through `create_agent` in `langchain.agents` (the successor to `create_react_agent` from `langgraph.prebuilt`). Both produce a ReAct-style agent that handles tool calling and multi-step reasoning automatically.

<div className="code-comparison">
<div className="code-comparison__column">
<CodeBlock language="python" title="Haystack">{`# pip install haystack-ai anthropic-haystack

from haystack.components.agents import Agent
from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Multiply \`a\` and \`b\`."""
    return a * b

@tool
def add(a: int, b: int) -> int:
    """Add \`a\` and \`b\`."""
    return a + b

# Create an agent - the agentic loop is handled automatically
agent = Agent(
    chat_generator=AnthropicChatGenerator(
        model="claude-sonnet-4-5-20250929",
        generation_kwargs={"temperature": 0},
    ),
    tools=[multiply, add],
    system_prompt="You are a helpful assistant that performs arithmetic.",
)

result = agent.run(messages=[
    ChatMessage.from_user("What is 3 multiplied by 7, then add 5?")
])
print(result["messages"][-1].text)`}</CodeBlock>
</div>
<div className="code-comparison__column">
<CodeBlock language="python" title="LangGraph + LangChain">{`# pip install langchain-anthropic langgraph

from langchain_anthropic import ChatAnthropic
from langchain_core.tools import tool
from langchain.agents import create_agent
from langchain_core.messages import HumanMessage, SystemMessage

@tool
def multiply(a: int, b: int) -> int:
    """Multiply \`a\` and \`b\`."""
    return a * b

@tool
def add(a: int, b: int) -> int:
    """Add \`a\` and \`b\`."""
    return a + b

# Create an agent - the agentic loop is handled automatically
model = ChatAnthropic(
    model="claude-sonnet-4-5-20250929",
    temperature=0,
)
agent = create_agent(
    model,
    tools=[multiply, add],
    system_prompt=SystemMessage(
        content="You are a helpful assistant that performs arithmetic."
    ),
)

result = agent.invoke({
    "messages": [HumanMessage(content="What is 3 multiplied by 7, then add 5?")]
})
print(result["messages"][-1].content)`}</CodeBlock>
</div>
</div>
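Under the hood, both shortcuts run the same loop: call the model, execute any tool calls it requests, append the results to the message history, and stop once the model answers directly. The framework-free sketch below illustrates that loop with the two arithmetic tools; `fake_model` is a hypothetical stand-in that scripts the LLM's decisions, not part of either framework's API.

```python
# Minimal sketch of the agentic loop that Agent / create_agent automate.
# fake_model is a scripted stand-in for the LLM (illustrative only).

def multiply(a: int, b: int) -> int:
    return a * b

def add(a: int, b: int) -> int:
    return a + b

TOOLS = {"multiply": multiply, "add": add}

def fake_model(messages):
    """Request multiply(3, 7), then add(result, 5), then give a final answer."""
    tool_results = [m for m in messages if m["role"] == "tool"]
    if len(tool_results) == 0:
        return {"tool_call": ("multiply", {"a": 3, "b": 7})}
    if len(tool_results) == 1:
        return {"tool_call": ("add", {"a": tool_results[0]["content"], "b": 5})}
    return {"answer": f"The result is {tool_results[-1]['content']}."}

def run_agent(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_model(messages)
        if "answer" in reply:            # model answered directly: stop looping
            return reply["answer"]
        name, args = reply["tool_call"]  # model requested a tool invocation
        result = TOOLS[name](**args)
        messages.append({"role": "tool", "name": name, "content": result})

print(run_agent("What is 3 multiplied by 7, then add 5?"))
# prints: The result is 26.
```

A real agent replaces `fake_model` with an LLM call that returns either text or structured tool calls; everything else in the loop is the same.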
### Connecting to Document Stores

Document stores are the foundation of retrieval-augmented generation (RAG). In Haystack, document stores integrate natively with pipeline components like Retrievers and Prompt Builders via explicit typed connections. LangChain centers retrieval on its vector store abstraction, which is composed with other components using LCEL (LangChain Expression Language).

Both frameworks offer in-memory stores for prototyping and a wide range of production backends (Elasticsearch, Qdrant, Weaviate, Pinecone, and more) via integrations.

**Step 1: Create a document store and add documents**

<div className="code-comparison">
<div className="code-comparison__column">
<CodeBlock language="python" title="Haystack">{`# pip install haystack-ai sentence-transformers

from haystack import Document
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.embedders import SentenceTransformersDocumentEmbedder

# Embed and write documents to the document store
document_store = InMemoryDocumentStore()

doc_embedder = SentenceTransformersDocumentEmbedder(
    model="sentence-transformers/all-MiniLM-L6-v2"
)

docs = [
    Document(content="Paris is the capital of France."),
    Document(content="Berlin is the capital of Germany."),
    Document(content="Tokyo is the capital of Japan."),
]
docs_with_embeddings = doc_embedder.run(docs)["documents"]
document_store.write_documents(docs_with_embeddings)`}</CodeBlock>
</div>
<div className="code-comparison__column">
<CodeBlock language="python" title="LangChain">{`# pip install langchain-community langchain-huggingface sentence-transformers

from langchain_huggingface import HuggingFaceEmbeddings
from langchain_community.vectorstores import InMemoryVectorStore
from langchain_core.documents import Document

# Embed and add documents to the vector store
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)
vectorstore = InMemoryVectorStore(embedding=embeddings)
vectorstore.add_documents([
    Document(page_content="Paris is the capital of France."),
    Document(page_content="Berlin is the capital of Germany."),
    Document(page_content="Tokyo is the capital of Japan."),
])`}</CodeBlock>
</div>
</div>
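Once documents are stored with embeddings, retrieval reduces to scoring the query vector against each stored vector and returning the best matches; both in-memory stores use cosine similarity by default. A toy sketch of that scoring with fabricated three-dimensional "embeddings" (real sentence-transformers vectors have hundreds of dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity: dot product of a and b divided by their norms."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Fabricated vectors standing in for embedder output
store = [
    ("Paris is the capital of France.",   [0.9, 0.1, 0.0]),
    ("Berlin is the capital of Germany.", [0.1, 0.9, 0.0]),
    ("Tokyo is the capital of Japan.",    [0.0, 0.1, 0.9]),
]

# Pretend this embeds "What is the capital of France?"
query_embedding = [0.8, 0.2, 0.1]

# Rank stored documents by similarity to the query, highest first
ranked = sorted(store, key=lambda d: cosine(query_embedding, d[1]), reverse=True)
print(ranked[0][0])
# prints: Paris is the capital of France.
```

This is what the embedding retriever in the next step does for you, plus top-k truncation and score filtering.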
**Step 2: Build a RAG pipeline**

<div className="code-comparison">
<div className="code-comparison__column">
<CodeBlock language="python" title="Haystack">{`from haystack import Pipeline
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator

# ChatPromptBuilder expects a List[ChatMessage] as template
template = [ChatMessage.from_user("""
Given the following documents, answer the question.
{% for doc in documents %}{{ doc.content }}{% endfor %}
Question: {{ question }}
""")]

rag_pipeline = Pipeline()
rag_pipeline.add_component(
    "text_embedder",
    SentenceTransformersTextEmbedder(model="sentence-transformers/all-MiniLM-L6-v2")
)
rag_pipeline.add_component(
    "retriever", InMemoryEmbeddingRetriever(document_store=document_store)
)
rag_pipeline.add_component(
    "prompt_builder", ChatPromptBuilder(template=template)
)
rag_pipeline.add_component(
    "llm", AnthropicChatGenerator(model="claude-sonnet-4-5-20250929")
)

rag_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")
rag_pipeline.connect("retriever.documents", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder.prompt", "llm.messages")

result = rag_pipeline.run({
    "text_embedder": {"text": "What is the capital of France?"},
    "prompt_builder": {"question": "What is the capital of France?"},
})
print(result["llm"]["replies"][0].text)`}</CodeBlock>
</div>
<div className="code-comparison__column">
<CodeBlock language="python" title="LangChain">{`from langchain_anthropic import ChatAnthropic
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

def format_docs(docs):
    return "\\n".join(doc.page_content for doc in docs)

retriever = vectorstore.as_retriever()
model = ChatAnthropic(model="claude-sonnet-4-5-20250929")

template = """
Given the following documents, answer the question.
{context}
Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)

rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

result = rag_chain.invoke("What is the capital of France?")
print(result)`}</CodeBlock>
</div>
</div>
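In both chains, the prompt step is plain string substitution: `ChatPromptBuilder` renders the Jinja template with the retrieved documents and the question, while `ChatPromptTemplate.from_template` fills in `{context}` and `{question}`. A framework-free sketch of what the rendered user message contains (document contents are hard-coded here in place of retrieval):

```python
# Emulate the template-rendering step of the RAG pipeline above.
documents = [
    "Paris is the capital of France.",
    "Berlin is the capital of Germany.",
]
question = "What is the capital of France?"

prompt = (
    "Given the following documents, answer the question.\n"
    # equivalent of {% for doc in documents %}{{ doc.content }}{% endfor %}
    + "".join(documents)
    + f"\nQuestion: {question}\n"
)
print(prompt)
```

The LLM only ever sees this flattened string (wrapped in a chat message); the retriever's job is simply to decide which document contents get inlined.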
### Using MCP Tools

Both frameworks support the [Model Context Protocol (MCP)](https://modelcontextprotocol.io), letting agents connect to external tools and services exposed by MCP servers. Haystack provides `MCPTool` and `MCPToolset` through the `mcp-haystack` integration package, and both plug directly into the `Agent` component. LangChain's MCP support relies on the separate `langchain-mcp-adapters` package and requires an async workflow throughout.

<div className="code-comparison">
<div className="code-comparison__column">
<CodeBlock language="python" title="Haystack">{`# pip install haystack-ai mcp-haystack anthropic-haystack

from haystack_integrations.tools.mcp import MCPToolset, StdioServerInfo
from haystack.components.agents import Agent
from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator
from haystack.dataclasses import ChatMessage

# Connect to an MCP server - tools are auto-discovered
toolset = MCPToolset(
    server_info=StdioServerInfo(
        command="uvx",
        args=["mcp-server-fetch"],
    )
)

agent = Agent(
    chat_generator=AnthropicChatGenerator(model="claude-sonnet-4-5-20250929"),
    tools=toolset,
    system_prompt="You are a helpful assistant that can fetch web content.",
)

result = agent.run(messages=[
    ChatMessage.from_user("Fetch the content from https://haystack.deepset.ai")
])
print(result["messages"][-1].text)`}</CodeBlock>
</div>
<div className="code-comparison__column">
<CodeBlock language="python" title="LangGraph + LangChain">{`# pip install langchain-mcp-adapters langgraph langchain-anthropic

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import create_agent
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage, SystemMessage

model = ChatAnthropic(model="claude-sonnet-4-5-20250929")

async def run():
    client = MultiServerMCPClient(
        {
            "fetch": {
                "command": "uvx",
                "args": ["mcp-server-fetch"],
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()
    agent = create_agent(
        model,
        tools,
        system_prompt=SystemMessage(
            content="You are a helpful assistant that can fetch web content."
        ),
    )
    result = await agent.ainvoke(
        {
            "messages": [
                HumanMessage(content="Fetch the content from https://haystack.deepset.ai")
            ]
        }
    )
    print(result["messages"][-1].content)

asyncio.run(run())`}</CodeBlock>
</div>
</div>
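Whichever framework you use, the MCP client underneath speaks JSON-RPC 2.0 with the server; over the stdio transport (`StdioServerInfo` / `"transport": "stdio"`), each message is a newline-delimited JSON object written to the server subprocess's stdin. A sketch of the `tools/call` request shape defined by the MCP specification, as it might look when the agent invokes the fetch tool (field values are illustrative):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke a tool
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch",  # tool name exposed by mcp-server-fetch
        "arguments": {"url": "https://haystack.deepset.ai"},
    },
}

# Over stdio transport, messages are newline-delimited JSON on stdin/stdout
wire = json.dumps(request) + "\n"
print(wire)

# The server parses the line back into the same structure
decoded = json.loads(wire)
assert decoded["method"] == "tools/call"
```

`MCPToolset` and `langchain-mcp-adapters` hide this framing entirely; it only matters when debugging a misbehaving server, e.g. by inspecting its stdin/stdout traffic.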
## Hear from Haystack Users

See how teams across industries use Haystack to power their production AI systems, from RAG applications to agentic workflows.
