Commit 0067644

Merge branch 'main' into github_repo_forker_tool_integration

2 parents: cd5ff1d + 64b7d10

23 files changed: 520 additions & 388 deletions


.github/workflows/ollama.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -22,7 +22,7 @@ concurrency:
 env:
   PYTHONUNBUFFERED: "1"
   FORCE_COLOR: "1"
-  LLM_FOR_TESTS: "llama3.2:3b"
+  LLM_FOR_TESTS: "qwen3:0.6b"
   EMBEDDER_FOR_TESTS: "nomic-embed-text"

jobs:
```

README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -37,8 +37,8 @@ Please check out our [Contribution Guidelines](CONTRIBUTING.md) for all the deta
 | [fastembed-haystack](integrations/fastembed/) | Embedder, Ranker | [![PyPI - Version](https://img.shields.io/pypi/v/fastembed-haystack.svg)](https://pypi.org/project/fastembed-haystack/) | [![Test / fastembed](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/fastembed.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/fastembed.yml) |
 | [github-haystack](integrations/github/) | Connector | [![PyPI - Version](https://img.shields.io/pypi/v/github-haystack.svg)](https://pypi.org/project/github-haystack) | [![Test / github](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/github.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/github.yml) |
 | [google-ai-haystack](integrations/google_ai/) | Generator | [![PyPI - Version](https://img.shields.io/pypi/v/google-ai-haystack.svg)](https://pypi.org/project/google-ai-haystack) | [![Test / google-ai](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_ai.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_ai.yml) |
-| [google-genai-haystack](integrations/google_genai/) | Generator | [![PyPI - Version](https://img.shields.io/pypi/v/google-genai-haystack.svg)](https://pypi.org/project/google-genai-haystack) | [![Test / google-genai](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_genai.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_genai.yml) |
-| [google-vertex-haystack](integrations/google_vertex/) | Generator | [![PyPI - Version](https://img.shields.io/pypi/v/google-vertex-haystack.svg)](https://pypi.org/project/google-vertex-haystack) | [![Test / google-vertex](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_vertex.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_vertex.yml) |
+| [google-genai-haystack](integrations/google_genai/) | Embedder, Generator | [![PyPI - Version](https://img.shields.io/pypi/v/google-genai-haystack.svg)](https://pypi.org/project/google-genai-haystack) | [![Test / google-genai](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_genai.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_genai.yml) |
+| [google-vertex-haystack](integrations/google_vertex/) | Embedder, Generator | [![PyPI - Version](https://img.shields.io/pypi/v/google-vertex-haystack.svg)](https://pypi.org/project/google-vertex-haystack) | [![Test / google-vertex](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_vertex.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/google_vertex.yml) |
 | [instructor-embedders-haystack](integrations/instructor_embedders/) | Embedder | [![PyPI - Version](https://img.shields.io/pypi/v/instructor-embedders-haystack.svg)](https://pypi.org/project/instructor-embedders-haystack) | [![Test / instructor-embedders](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/instructor_embedders.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/instructor_embedders.yml) |
 | [jina-haystack](integrations/jina/) | Connector, Embedder, Ranker | [![PyPI - Version](https://img.shields.io/pypi/v/jina-haystack.svg)](https://pypi.org/project/jina-haystack) | [![Test / jina](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/jina.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/jina.yml) |
 | [langfuse-haystack](integrations/langfuse/) | Tracer | [![PyPI - Version](https://img.shields.io/pypi/v/langfuse-haystack.svg?color=orange)](https://pypi.org/project/langfuse-haystack) | [![Test / langfuse](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/langfuse.yml/badge.svg)](https://github.com/deepset-ai/haystack-core-integrations/actions/workflows/langfuse.yml) |
```

integrations/amazon_bedrock/CHANGELOG.md

Lines changed: 12 additions & 0 deletions

```diff
@@ -1,5 +1,17 @@
 # Changelog

+## [integrations/amazon_bedrock-v3.8.0] - 2025-07-04
+
+### 🚀 Features
+
+- Pass component_info to StreamingChunk in AmazonBedrockChatGenerator (#2042)
+
+### 🧹 Chores
+
+- Remove black (#1985)
+- Improve typing for select_streaming_callback (#2008)
+
+
 ## [integrations/amazon_bedrock-v3.7.0] - 2025-06-11

 ### 🐛 Bug Fixes
```

integrations/amazon_bedrock/pyproject.toml

Lines changed: 1 addition & 4 deletions

```diff
@@ -23,7 +23,7 @@ classifiers = [
   "Programming Language :: Python :: Implementation :: CPython",
   "Programming Language :: Python :: Implementation :: PyPy",
 ]
-dependencies = ["haystack-ai>=2.13.1", "boto3>=1.28.57", "aioboto3>=14.0.0"]
+dependencies = ["haystack-ai>=2.15.1", "boto3>=1.28.57", "aioboto3>=14.0.0"]

 [project.urls]
 Documentation = "https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/amazon_bedrock#readme"
@@ -163,9 +163,6 @@ exclude_lines = ["no cov", "if __name__ == .__main__.:", "if TYPE_CHECKING:"]
 [tool.pytest.ini_options]
 addopts = "--strict-markers"
 markers = [
-  "unit: unit tests",
   "integration: integration tests",
-  "embedders: embedders tests",
-  "generators: generators tests",
 ]
 log_cli = true
```

integrations/amazon_bedrock/src/haystack_integrations/components/generators/amazon_bedrock/chat/chat_generator.py

Lines changed: 17 additions & 3 deletions

```diff
@@ -5,7 +5,7 @@
 from botocore.eventstream import EventStream
 from botocore.exceptions import ClientError
 from haystack import component, default_from_dict, default_to_dict, logging
-from haystack.dataclasses import ChatMessage, StreamingCallbackT, select_streaming_callback
+from haystack.dataclasses import ChatMessage, ComponentInfo, StreamingCallbackT, select_streaming_callback
 from haystack.tools import (
     Tool,
     Toolset,
@@ -408,6 +408,8 @@ def run(
         :raises AmazonBedrockInferenceError:
             If the Bedrock inference API call fails.
         """
+        component_info = ComponentInfo.from_component(self)
+
         params, callback = self._prepare_request_params(
             messages=messages,
             streaming_callback=streaming_callback,
@@ -424,7 +426,12 @@ def run(
                 msg = "No stream found in the response."
                 raise AmazonBedrockInferenceError(msg)
             # the type of streaming callback is checked in _prepare_request_params, but mypy doesn't know
-            replies = _parse_streaming_response(response_stream, callback, self.model)  # type: ignore[arg-type]
+            replies = _parse_streaming_response(
+                response_stream=response_stream,
+                streaming_callback=callback,  # type: ignore[arg-type]
+                model=self.model,
+                component_info=component_info,
+            )
         else:
             response = self.client.converse(**params)
             replies = _parse_completion_response(response, self.model)
@@ -461,6 +468,8 @@ async def run_async(
         :raises AmazonBedrockInferenceError:
             If the Bedrock inference API call fails.
         """
+        component_info = ComponentInfo.from_component(self)
+
         params, callback = self._prepare_request_params(
             messages=messages,
             streaming_callback=streaming_callback,
@@ -481,7 +490,12 @@ async def run_async(
                 msg = "No stream found in the response."
                 raise AmazonBedrockInferenceError(msg)
             # the type of streaming callback is checked in _prepare_request_params, but mypy doesn't know
-            replies = await _parse_streaming_response_async(response_stream, callback, self.model)  # type: ignore[arg-type]
+            replies = await _parse_streaming_response_async(
+                response_stream=response_stream,
+                streaming_callback=callback,  # type: ignore[arg-type]
+                model=self.model,
+                component_info=component_info,
+            )
         else:
             response = await async_client.converse(**params)
             replies = _parse_completion_response(response, self.model)
```
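The mechanics of this change can be sketched outside of Haystack with simplified stand-in classes (the real `ComponentInfo` and `StreamingChunk` live in `haystack.dataclasses`; everything below is an illustrative reduction, not the library's actual implementation):

```python
from dataclasses import dataclass
from typing import List, Optional


# Simplified stand-ins for haystack.dataclasses.ComponentInfo / StreamingChunk.
@dataclass
class ComponentInfo:
    type: str  # dotted import path of the component class
    name: Optional[str] = None  # component name inside a pipeline, if any

    @classmethod
    def from_component(cls, component) -> "ComponentInfo":
        c = type(component)
        return cls(type=f"{c.__module__}.{c.__qualname__}")


@dataclass
class StreamingChunk:
    content: str
    component_info: Optional[ComponentInfo] = None


class AmazonBedrockChatGenerator:  # minimal mock, real class wraps the Bedrock client
    pass


def parse_streaming_response(events: List[str], component_info: ComponentInfo) -> List[StreamingChunk]:
    # After the change above, every chunk carries a reference to its producer.
    return [StreamingChunk(content=e, component_info=component_info) for e in events]


# run() builds the ComponentInfo once per call and threads it to the parser.
info = ComponentInfo.from_component(AmazonBedrockChatGenerator())
chunks = parse_streaming_response(["Hel", "lo"], info)
print(chunks[0].component_info.type)
```

This is why `component_info` is created at the top of both `run` and `run_async`: the same object is shared by every chunk of that invocation.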

integrations/amazon_bedrock/src/haystack_integrations/components/generators/amazon_bedrock/chat/utils.py

Lines changed: 13 additions & 3 deletions

```diff
@@ -8,6 +8,7 @@
     AsyncStreamingCallbackT,
     ChatMessage,
     ChatRole,
+    ComponentInfo,
     StreamingChunk,
     SyncStreamingCallbackT,
     ToolCall,
@@ -235,7 +236,9 @@ def _parse_completion_response(response_body: Dict[str, Any], model: str) -> Lis


 # Bedrock streaming to Haystack util methods
-def _convert_event_to_streaming_chunk(event: Dict[str, Any], model: str) -> StreamingChunk:
+def _convert_event_to_streaming_chunk(
+    event: Dict[str, Any], model: str, component_info: ComponentInfo
+) -> StreamingChunk:
     """
     Convert a Bedrock streaming event to a Haystack StreamingChunk.

@@ -244,6 +247,7 @@ def _convert_event_to_streaming_chunk(event: Dict[str, Any], model: str) -> Stre
     :param event: Dictionary containing a Bedrock streaming event.
     :param model: The model ID used for generation, included in chunk metadata.
+    :param component_info: ComponentInfo object
     :returns: StreamingChunk object containing the content and metadata extracted from the event.
     """
     # Initialize an empty StreamingChunk to return if no relevant event is found
@@ -358,6 +362,8 @@ def _convert_event_to_streaming_chunk(event: Dict[str, Any], model: str) -> Stre
         },
     )

+    streaming_chunk.component_info = component_info
+
     return streaming_chunk


@@ -438,18 +444,20 @@ def _parse_streaming_response(
     response_stream: EventStream,
     streaming_callback: SyncStreamingCallbackT,
     model: str,
+    component_info: ComponentInfo,
 ) -> List[ChatMessage]:
     """
     Parse a streaming response from Bedrock.

     :param response_stream: EventStream from Bedrock API
     :param streaming_callback: Callback for streaming chunks
     :param model: The model ID used for generation
+    :param component_info: ComponentInfo object
     :return: List of ChatMessage objects
     """
     chunks: List[StreamingChunk] = []
     for event in response_stream:
-        streaming_chunk = _convert_event_to_streaming_chunk(event=event, model=model)
+        streaming_chunk = _convert_event_to_streaming_chunk(event=event, model=model, component_info=component_info)
         streaming_callback(streaming_chunk)
         chunks.append(streaming_chunk)
     replies = [_convert_streaming_chunks_to_chat_message(chunks=chunks)]
@@ -460,18 +468,20 @@ async def _parse_streaming_response_async(
     response_stream: EventStream,
     streaming_callback: AsyncStreamingCallbackT,
     model: str,
+    component_info: ComponentInfo,
 ) -> List[ChatMessage]:
     """
     Parse a streaming response from Bedrock.

     :param response_stream: EventStream from Bedrock API
     :param streaming_callback: Callback for streaming chunks
     :param model: The model ID used for generation
+    :param component_info: ComponentInfo object
     :return: List of ChatMessage objects
     """
     chunks: List[StreamingChunk] = []
     async for event in response_stream:
-        streaming_chunk = _convert_event_to_streaming_chunk(event=event, model=model)
+        streaming_chunk = _convert_event_to_streaming_chunk(event=event, model=model, component_info=component_info)
         await streaming_callback(streaming_chunk)
         chunks.append(streaming_chunk)
     replies = [_convert_streaming_chunks_to_chat_message(chunks=chunks)]
```

integrations/amazon_bedrock/tests/test_chat_generator.py

Lines changed: 1 addition & 0 deletions

```diff
@@ -298,6 +298,7 @@ def streaming_callback(chunk: StreamingChunk):
     streaming_callback_called = True
     assert isinstance(chunk, StreamingChunk)
     assert chunk.content is not None
+    assert chunk.component_info is not None
     if not paris_found_in_response:
         paris_found_in_response = "paris" in chunk.content.lower()
```

integrations/amazon_bedrock/tests/test_chat_generator_utils.py

Lines changed: 20 additions & 3 deletions

```diff
@@ -1,5 +1,5 @@
 import pytest
-from haystack.dataclasses import ChatMessage, ChatRole, StreamingChunk, ToolCall
+from haystack.dataclasses import ChatMessage, ChatRole, ComponentInfo, StreamingChunk, ToolCall
 from haystack.tools import Tool

 from haystack_integrations.components.generators.amazon_bedrock.chat.utils import (
@@ -339,6 +339,9 @@ def test_process_streaming_response_one_tool_call(self, mock_boto3_session):
         Test that process_streaming_response correctly handles streaming events and accumulates responses
         """
         model = "anthropic.claude-3-5-sonnet-20240620-v1:0"
+        type_ = (
+            "haystack_integrations.components.generators.amazon_bedrock.chat.chat_generator.AmazonBedrockChatGenerator"
+        )
         streaming_chunks = []

         def test_callback(chunk: StreamingChunk):
@@ -379,7 +382,11 @@ def test_callback(chunk: StreamingChunk):
             },
         ]

-        replies = _parse_streaming_response(events, test_callback, model)
+        component_info = ComponentInfo(
+            type=type_,
+        )
+
+        replies = _parse_streaming_response(events, test_callback, model, component_info)
         # Pop completion_start_time since it will always change
         replies[0].meta.pop("completion_start_time")
         expected_messages = [
@@ -413,13 +420,19 @@ def test_callback(chunk: StreamingChunk):
                 "type": "function",
             }
         ]
+        for chunk in streaming_chunks:
+            assert chunk.component_info.type == type_
+            assert chunk.component_info.name is None  # not in a pipeline

         # Verify final replies
         assert len(replies) == 1
         assert replies == expected_messages

     def test_parse_streaming_response_with_two_tool_calls(self, mock_boto3_session):
         model = "anthropic.claude-3-5-sonnet-20240620-v1:0"
+        type_ = (
+            "haystack_integrations.components.generators.amazon_bedrock.chat.chat_generator.AmazonBedrockChatGenerator"
+        )
         streaming_chunks = []

         def test_callback(chunk: StreamingChunk):
@@ -468,7 +481,11 @@ def test_callback(chunk: StreamingChunk):
             },
         ]

-        replies = _parse_streaming_response(events, test_callback, model)
+        component_info = ComponentInfo(
+            type=type_,
+        )
+
+        replies = _parse_streaming_response(events, test_callback, model, component_info)
         # Pop completion_start_time since it will always change
         replies[0].meta.pop("completion_start_time")
         expected_messages = [
```

integrations/anthropic/CHANGELOG.md

Lines changed: 7 additions & 0 deletions

```diff
@@ -1,5 +1,12 @@
 # Changelog

+## [integrations/anthropic-v3.1.0] - 2025-07-04
+
+### 🚀 Features
+
+- Pass `component_info` to `StreamingChunk` in `AnthropicChatGenerator` (#2056)
+
+
 ## [integrations/anthropic-v3.0.0] - 2025-06-30

 ### 🚀 Features
```

integrations/anthropic/pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -23,7 +23,7 @@ classifiers = [
   "Programming Language :: Python :: Implementation :: CPython",
   "Programming Language :: Python :: Implementation :: PyPy",
 ]
-dependencies = ["haystack-ai>=2.13.1", "anthropic>=0.47.0"]
+dependencies = ["haystack-ai>=2.15.1", "anthropic>=0.47.0"]

 [project.urls]
 Documentation = "https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/anthropic#readme"
```
