
Commit ae88a6a

docs: fix OpenAPIServiceConnector and OpenAPIServiceToFunctions code snippets + docs (#10851)
* fix: correct OpenAPIServiceConnector and OpenAPIServiceToFunctions code snippets
  - OpenAPIServiceConnector: use `ToolCall` and `ChatMessage.from_assistant(tool_calls=[...])` instead of the invalid `ChatMessage.from_assistant(json.dumps(fc_payload))`; fix the invalid Python `serper_token = <your_serper_dev_token>` to use a placeholder string
  - OpenAPIServiceToFunctions: use `ByteStream.from_string()` with an inline spec instead of the non-existent `path/to/openapi_definition.yaml`

  Fixes #10645

* Apply suggestions from code review
* fixes
* simplify update docs
* better docs

Co-authored-by: Stefano Fiorucci <stefanofiorucci@gmail.com>
1 parent a5a2bd3 commit ae88a6a

6 files changed

Lines changed: 331 additions & 207 deletions

File tree

docs-website/docs/pipeline-components/connectors/openapiserviceconnector.mdx

Lines changed: 75 additions & 46 deletions
```diff
@@ -14,8 +14,8 @@ description: "`OpenAPIServiceConnector` is a component that acts as an interface
 | | |
 | --- | --- |
 | **Most common position in a pipeline** | Flexible |
-| **Mandatory run variables** | `messages`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects where the last message is expected to carry parameter invocation payload. <br /> <br />`service_openapi_spec`: OpenAPI specification of the service being invoked. It can be YAML/JSON, and all ref values must be resolved. <br /> <br />`service_credentials`: Authentication credentials for the service. We currently support two OpenAPI spec v3 security schemes: <br /> <br />1. http – for Basic, Bearer, and other HTTP authentication schemes; <br />2. apiKey – for API keys and cookie authentication. |
-| **Output variables** | `service_response`: A dictionary that is a list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects where each message corresponds to a function invocation. <br />If a user specifies multiple function calling requests, there will be multiple responses. |
+| **Mandatory run variables** | `messages`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects where the last message must be from the assistant and contain tool calls. <br /> <br />`service_openapi_spec`: OpenAPI specification of the service being invoked. It can be YAML/JSON, and all ref values must be resolved. <br /> <br />`service_credentials`: Authentication credentials for the service. We currently support two OpenAPI spec v3 security schemes: <br /> <br />1. http – for Basic, Bearer, and other HTTP authentication schemes; <br />2. apiKey – for API keys and cookie authentication. |
+| **Output variables** | `service_response`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects where each message corresponds to a tool call invocation. <br />If a message contains multiple tool calls, there will be multiple responses. |
 | **API reference** | [Connectors](/reference/connectors-api) |
 | **GitHub link** | https://github.com/deepset-ai/haystack/blob/main/haystack/components/connectors/openapi_service.py |
```
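The table's requirement that "all ref values must be resolved" means the spec handed to `service_openapi_spec` may not contain unresolved `$ref` pointers. A minimal illustrative sketch of what that resolution does — plain Python with a hypothetical `resolve_refs` helper, not part of Haystack:

```python
def resolve_refs(node, root):
    """Replace every {"$ref": "#/..."} pointer by the object it points to."""
    if isinstance(node, dict):
        if "$ref" in node:
            target = root
            # "#/components/schemas/Query" -> ["components", "schemas", "Query"]
            for part in node["$ref"].lstrip("#/").split("/"):
                target = target[part]
            return resolve_refs(target, root)
        return {key: resolve_refs(value, root) for key, value in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, root) for item in node]
    return node

spec = {
    "components": {"schemas": {"Query": {"type": "object"}}},
    "paths": {"/search": {"post": {"requestBody": {"$ref": "#/components/schemas/Query"}}}},
}
resolved = resolve_refs(spec, spec)
print(resolved["paths"]["/search"]["post"]["requestBody"])  # {'type': 'object'}
```

In practice a library such as `jsonref` would do this; the sketch only shows the shape of the input the connector expects.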
```diff
@@ -37,13 +37,13 @@ pip install openapi3
 
 ### On its own
 
-This component is primarily meant to be used in pipelines, as [`OpenAPIServiceToFunctions`](../converters/openapiservicetofunctions.mdx), in tandem with the function calling model, resolves the actual function calling parameters that are injected as invocation parameters for `OpenAPIServiceConnector`.
+This component is primarily meant to be used in pipelines, as [`OpenAPIServiceToFunctions`](../converters/openapiservicetofunctions.mdx), in tandem with an LLM with tool calling capabilities, resolves the actual tool call parameters that are injected as invocation parameters for `OpenAPIServiceConnector`.
 
 ### In a pipeline
 
-Let's say we're linking the Serper search engine to a pipeline. Here, `OpenAPIServiceConnector` uses the abilities of `OpenAPIServiceToFunctions`. `OpenAPIServiceToFunctions` first fetches and changes the [Serper's OpenAPI specification](https://bit.ly/serper_dev_spec) into a format that OpenAI's function calling mechanism can understand. Then, `OpenAPIServiceConnector` activates the Serper service using this specification.
+Let's say we're linking the Serper search engine to a pipeline. Here, `OpenAPIServiceConnector` uses the abilities of `OpenAPIServiceToFunctions`. `OpenAPIServiceToFunctions` first fetches and changes the [Serper's OpenAPI specification](https://bit.ly/serper_dev_spec) into function definitions that an LLM with tool calling capabilities can understand. Then, `OpenAPIServiceConnector` activates the Serper service using this specification.
 
-More precisely, `OpenAPIServiceConnector` dynamically calls methods defined in the Serper OpenAPI specification. This involves reading chat messages or other inputs to extract function call parameters, handling authentication with the Serper service, and making the right API calls. The connector makes sure that the method call follows the Serper API requirements, such as correct formatting requests and handling responses.
+More precisely, `OpenAPIServiceConnector` dynamically calls methods defined in the Serper OpenAPI specification. This involves reading chat messages to extract tool call parameters, handling authentication with the Serper service, and making the right API calls. The connector makes sure that the method call follows the Serper API requirements, such as correct formatting requests and handling responses.
 
 Note that we used Serper just as an example here. This could be any OpenAPI-compliant service.
 
```
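The dispatch the updated paragraph describes — read the tool calls off the last chat message, match each to an operation in the spec, invoke it — can be sketched in plain Python. All names here are hypothetical and dicts stand in for Haystack's `ChatMessage`/`ToolCall` objects; the real connector resolves operations from the OpenAPI spec via the `openapi3` client:

```python
def dispatch_tool_calls(last_message, operations):
    """Invoke one operation per tool call; return one response per call."""
    responses = []
    for call in last_message["tool_calls"]:
        operation = operations[call["tool_name"]]        # e.g. Serper's "search" operation
        responses.append(operation(**call["arguments"]))  # arguments come from the LLM
    return responses

# Stub operation standing in for an authenticated HTTP call to the service.
operations = {"search": lambda q: {"query": q, "organic": []}}
message = {"tool_calls": [{"tool_name": "search", "arguments": {"q": "Sam Altman"}}]}
responses = dispatch_tool_calls(message, operations)
print(responses[0]["query"])  # Sam Altman
```

This is why a message carrying two tool calls yields two entries in `service_response`.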
````diff
@@ -55,57 +55,86 @@ To run the following code snippet, note that you have to have your own Serper an
 import json
 import requests
 
-from typing import Dict, Any, List
+from typing import Any
+
 from haystack import Pipeline
-from haystack.components.generators.utils import print_streaming_chunk
+from haystack.components.connectors import OpenAPIServiceConnector
 from haystack.components.converters import OpenAPIServiceToFunctions, OutputAdapter
 from haystack.components.generators.chat import OpenAIChatGenerator
-from haystack.components.connectors import OpenAPIServiceConnector
-from haystack.components.fetchers import LinkContentFetcher
-from haystack.dataclasses import ChatMessage, ByteStream
-from haystack.utils import Secret
+from haystack.dataclasses import ChatMessage
+from haystack.dataclasses.byte_stream import ByteStream
 
-def prepare_fc_params(openai_functions_schema: Dict[str, Any]) -> Dict[str, Any]:
+
+def prepare_fc_params(openai_functions_schema: dict[str, Any]) -> dict[str, Any]:
     return {
-        "tools": [{
-            "type": "function",
-            "function": openai_functions_schema
-        }],
+        "tools": [{"type": "function", "function": openai_functions_schema}],
         "tool_choice": {
             "type": "function",
-            "function": {"name": openai_functions_schema["name"]}
-        }
+            "function": {"name": openai_functions_schema["name"]},
+        },
     }
 
-system_prompt = requests.get("https://bit.ly/serper_dev_system_prompt").text
-serper_spec = requests.get("https://bit.ly/serper_dev_spec").text
-
-pipe = Pipeline()
-pipe.add_component("spec_to_functions", OpenAPIServiceToFunctions())
-pipe.add_component("functions_llm", OpenAIChatGenerator(api_key=Secret.from_token(llm_api_key), model="gpt-3.5-turbo-0613"))
-pipe.add_component("openapi_container", OpenAPIServiceConnector())
-pipe.add_component("a1", OutputAdapter("{{functions[0] | prepare_fc}}", Dict[str, Any], {"prepare_fc": prepare_fc_params}))
-pipe.add_component("a2", OutputAdapter("{{specs[0]}}", Dict[str, Any]))
-pipe.add_component("a3", OutputAdapter("{{system_message + service_response}}", List[ChatMessage]))
-pipe.add_component("llm", OpenAIChatGenerator(api_key=Secret.from_token(llm_api_key), model="gpt-4-1106-preview", streaming_callback=print_streaming_chunk))
-
-pipe.connect("spec_to_functions.functions", "a1.functions")
-pipe.connect("spec_to_functions.openapi_specs", "a2.specs")
-pipe.connect("a1", "functions_llm.generation_kwargs")
-pipe.connect("functions_llm.replies", "openapi_container.messages")
-pipe.connect("a2", "openapi_container.service_openapi_spec")
-pipe.connect("openapi_container.service_response", "a3.service_response")
-pipe.connect("a3", "llm.messages")
 
+serperdev_spec = requests.get("https://bit.ly/serper_dev_spec").json()
+system_prompt = requests.get("https://bit.ly/serper_dev_system").text
 user_prompt = "Why was Sam Altman ousted from OpenAI?"
 
-result = pipe.run(data={"functions_llm": {"messages":[ChatMessage.from_system("Only do function calling"), ChatMessage.from_user(user_prompt)]},
-                        "openapi_container": {"service_credentials": serper_dev_key},
-                        "spec_to_functions": {"sources": [ByteStream.from_string(serper_spec)]},
-                        "a3": {"system_message": [ChatMessage.from_system(system_prompt)]}})
-
->Sam Altman was ousted from OpenAI on November 17, 2023, following
->a "deliberative review process" by the board of directors. The board concluded
->that he was not "consistently candid in his communications". However, he
->returned as CEO just days after his ouster.
+pipe = Pipeline()
+pipe.add_component("spec_to_functions", OpenAPIServiceToFunctions())
+pipe.add_component(
+    "prepare_fc_adapter",
+    OutputAdapter(
+        "{{functions[0] | prepare_fc}}",
+        dict[str, Any],
+        {"prepare_fc": prepare_fc_params},
+    ),
+)
+pipe.add_component("functions_llm", OpenAIChatGenerator())
+pipe.add_component("openapi_connector", OpenAPIServiceConnector())
+pipe.add_component(
+    "message_adapter",
+    OutputAdapter(
+        "{{system_message + service_response}}",
+        list[ChatMessage],
+        unsafe=True,
+    ),
+)
+pipe.add_component("llm", OpenAIChatGenerator())
+
+pipe.connect("spec_to_functions.functions", "prepare_fc_adapter.functions")
+pipe.connect(
+    "spec_to_functions.openapi_specs",
+    "openapi_connector.service_openapi_spec",
+)
+pipe.connect("prepare_fc_adapter", "functions_llm.generation_kwargs")
+pipe.connect("functions_llm.replies", "openapi_connector.messages")
+pipe.connect("openapi_connector.service_response", "message_adapter.service_response")
+pipe.connect("message_adapter", "llm.messages")
+
+result = pipe.run(
+    data={
+        "functions_llm": {
+            "messages": [
+                ChatMessage.from_system("Only do tool/function calling"),
+                ChatMessage.from_user(user_prompt),
+            ],
+        },
+        "openapi_connector": {
+            "service_credentials": serper_dev_key,
+        },
+        "spec_to_functions": {
+            "sources": [ByteStream.from_string(json.dumps(serperdev_spec))],
+        },
+        "message_adapter": {
+            "system_message": [ChatMessage.from_system(system_prompt)],
+        },
+    },
+)
+
+print(result["llm"]["replies"][0].text)
+
+# Sam Altman was ousted from OpenAI on November 17, 2023, following
+# a "deliberative review process" by the board of directors. The board concluded
+# that he was not "consistently candid in his communications". However, he
+# returned as CEO just days after his ouster.
 ```
````
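The `prepare_fc_params` helper added in this diff is plain Python and can be exercised on its own: it wraps a single OpenAI-style function schema into `tools`/`tool_choice` generation kwargs that force the model to call exactly that function. The `"search"` schema below is a made-up stand-in for the first function extracted from the Serper spec:

```python
from typing import Any

def prepare_fc_params(openai_functions_schema: dict[str, Any]) -> dict[str, Any]:
    # Same helper as in the updated snippet: advertise one tool, and pin
    # tool_choice to it so the LLM must emit a call to that function.
    return {
        "tools": [{"type": "function", "function": openai_functions_schema}],
        "tool_choice": {
            "type": "function",
            "function": {"name": openai_functions_schema["name"]},
        },
    }

kwargs = prepare_fc_params({"name": "search", "parameters": {"type": "object"}})
print(kwargs["tool_choice"]["function"]["name"])  # search
```

In the pipeline these kwargs flow through the `prepare_fc_adapter` `OutputAdapter` into `functions_llm.generation_kwargs`.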
