Commit 5dbbb52

Merge branch 'release-please--branches--main--changes--next' of github.com:openlayer-ai/openlayer-python into release-please--branches--main--changes--next
2 parents 9e8ffab + 4592645 commit 5dbbb52

File tree: 4 files changed (+82, -16 lines)

CHANGELOG.md

Lines changed: 23 additions & 15 deletions
@@ -5,34 +5,42 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
 and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).
 
-## 0.12.0 (2025-12-17)
+## 0.12.0 (2025-12-26)
 
 Full Changelog: [v0.11.4...v0.12.0](https://github.com/openlayer-ai/openlayer-python/compare/v0.11.4...v0.12.0)
 
 ### Features
 
-* **api:** api update ([f53fab3](https://github.com/openlayer-ai/openlayer-python/commit/f53fab30c1d6404e144e2c862a66f9e08086480a))
-* **api:** api update ([7a554fd](https://github.com/openlayer-ai/openlayer-python/commit/7a554fd1301aba3f8c58039780414b86f835d06a))
+* **api:** api update ([f117723](https://github.com/openlayer-ai/openlayer-python/commit/f11772338f2da872ca49f0a658c45109b9507800))
+* **api:** api update ([9f22b46](https://github.com/openlayer-ai/openlayer-python/commit/9f22b4660e4b37a786a6ff3999847936577c8a23))
+* enhance LiteLLM metadata extraction by adding cost retrieval from response headers ([43a7d2e](https://github.com/openlayer-ai/openlayer-python/commit/43a7d2e44b788586a3e4e3e484e9a7967f3631af))
+* prevent multiple patching in trace_litellm function to avoid duplicate traces ([#568](https://github.com/openlayer-ai/openlayer-python/issues/568)) ([a1619e7](https://github.com/openlayer-ai/openlayer-python/commit/a1619e793878720b5719ca7e2a580e9990b88f96))
 
 
 ### Bug Fixes
 
-* **client:** close streams without requiring full consumption ([178a48f](https://github.com/openlayer-ai/openlayer-python/commit/178a48fc44b1bc35d11ffb72f81e5b6a2c761c89))
-* compat with Python 3.14 ([9a18d6e](https://github.com/openlayer-ai/openlayer-python/commit/9a18d6ee1df7d00210243b81643d05a509d94a96))
+* **client:** close streams without requiring full consumption ([c1458b7](https://github.com/openlayer-ai/openlayer-python/commit/c1458b7487946db5d4a2037852ee0acd31cc5672))
+* compat with Python 3.14 ([c2d9681](https://github.com/openlayer-ai/openlayer-python/commit/c2d9681e556118e9f6f91de46f1662b7a794ee2c))
 
 
 ### Chores
 
-* bump `httpx-aiohttp` version to 0.1.9 ([a8d04ae](https://github.com/openlayer-ai/openlayer-python/commit/a8d04ae6242bdaad9651ffbaa537b217f4eeaba8))
-* do not install brew dependencies in ./scripts/bootstrap by default ([053b9e4](https://github.com/openlayer-ai/openlayer-python/commit/053b9e409d5bb48aafb1641d292d4105eccd0e76))
-* **internal/tests:** avoid race condition with implicit client cleanup ([41ae1b6](https://github.com/openlayer-ai/openlayer-python/commit/41ae1b64f85a1ff8878e43bb872904b93acfd23d))
-* **internal:** detect missing future annotations with ruff ([b778406](https://github.com/openlayer-ai/openlayer-python/commit/b778406f38812d00cd770717fdf518725c935582))
-* **internal:** grammar fix (it's -> its) ([65edc47](https://github.com/openlayer-ai/openlayer-python/commit/65edc47c162f3e545c463f24ce5bb75769a69aae))
-* **internal:** update pydantic dependency ([9e14c8a](https://github.com/openlayer-ai/openlayer-python/commit/9e14c8a69c179bc43c417856455db3854065ccdf))
-* **internal:** version bump ([c10fa5d](https://github.com/openlayer-ai/openlayer-python/commit/c10fa5d7d549b2c23865e1c22ff08760ceab6324))
-* **internal:** version bump ([ff91ea8](https://github.com/openlayer-ai/openlayer-python/commit/ff91ea81a82a9ffb96946ecfe78c7177783dbaa7))
-* **package:** drop Python 3.8 support ([d3d0f8f](https://github.com/openlayer-ai/openlayer-python/commit/d3d0f8fd5d001bac3c6db1cb87a02030049c7fec))
-* **types:** change optional parameter type from NotGiven to Omit ([acd07c4](https://github.com/openlayer-ai/openlayer-python/commit/acd07c48b6414194749c13fbe79162bd438c31e2))
+* bump `httpx-aiohttp` version to 0.1.9 ([3f895de](https://github.com/openlayer-ai/openlayer-python/commit/3f895ded9e56ca11c4aafc093502726a14afedf6))
+* do not install brew dependencies in ./scripts/bootstrap by default ([45badc5](https://github.com/openlayer-ai/openlayer-python/commit/45badc57d4e30a195407e5b733a7327356685c69))
+* **internal/tests:** avoid race condition with implicit client cleanup ([3d05ccc](https://github.com/openlayer-ai/openlayer-python/commit/3d05ccc4ee73fe8f5c3de50d4a7dcbdcb3551674))
+* **internal:** detect missing future annotations with ruff ([d2887ba](https://github.com/openlayer-ai/openlayer-python/commit/d2887ba77f441518ca7e8ae1b690cee42794596a))
+* **internal:** grammar fix (it's -> its) ([4af20e1](https://github.com/openlayer-ai/openlayer-python/commit/4af20e1d560be01fb7eaa9dd050398fe23f1e4bb))
+* **internal:** update pydantic dependency ([0af11ac](https://github.com/openlayer-ai/openlayer-python/commit/0af11ac1a28176380247bf3e3b5db00fe6349593))
+* **internal:** version bump ([55f6ab0](https://github.com/openlayer-ai/openlayer-python/commit/55f6ab040221d31a154ef21c71a1957b77043f79))
+* **internal:** version bump ([3a5c286](https://github.com/openlayer-ai/openlayer-python/commit/3a5c2869c16c0f00e871042af29da29063a70c07))
+* **internal:** version bump ([d5dc8c0](https://github.com/openlayer-ai/openlayer-python/commit/d5dc8c0fd5b792e4a81f6445df0091f1cf9f8e36))
+* **internal:** version bump ([61324d6](https://github.com/openlayer-ai/openlayer-python/commit/61324d655921bbe613cd2542feba4a758d591293))
+* **internal:** version bump ([9db5997](https://github.com/openlayer-ai/openlayer-python/commit/9db5997e6d7648e00c37277ea87b2d5f7e79aa4f))
+* **internal:** version bump ([4659537](https://github.com/openlayer-ai/openlayer-python/commit/465953753d902fd2345cac72e0ed87131668b0f4))
+* **internal:** version bump ([199356c](https://github.com/openlayer-ai/openlayer-python/commit/199356ca1ce8eae57a6557c17d8c173486132fc2))
+* **internal:** version bump ([e63dee5](https://github.com/openlayer-ai/openlayer-python/commit/e63dee5ac27bfc396ec709188a54c3045e1e3698))
+* **package:** drop Python 3.8 support ([4c48617](https://github.com/openlayer-ai/openlayer-python/commit/4c48617ef7c34e0db601617a3b30d3bb540eeea5))
+* **types:** change optional parameter type from NotGiven to Omit ([54c1533](https://github.com/openlayer-ai/openlayer-python/commit/54c1533b6810fcf9323c23f39dc008405fd0a36e))
 
 ## 0.11.4 (2025-12-17)

examples/tracing/openai/openai_agents_tracing.ipynb

Lines changed: 3 additions & 0 deletions
@@ -177,6 +177,9 @@
 "metadata": {},
 "outputs": [],
 "source": [
+"from __future__ import annotations\n",
+"\n",
+"\n",
 "class AirlineAgentContext(BaseModel):\n",
 "    \"\"\"Context model to maintain conversation state across agents.\"\"\"\n",
 "    passenger_name: str | None = None\n",

examples/tracing/openai/openai_tracing.ipynb

Lines changed: 2 additions & 0 deletions
@@ -121,6 +121,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
+"from __future__ import annotations\n",
+"\n",
 "from pydantic import BaseModel\n",
 "\n",
 "\n",

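The notebook diffs above add `from __future__ import annotations` because the cells annotate fields with PEP 604 unions (`str | None`), syntax that is only evaluated successfully at runtime on Python 3.10+. With the future import, annotations are stored as strings (PEP 563) and are not evaluated at class-definition time. A minimal sketch of the mechanics, using a plain dataclass instead of the notebooks' pydantic models (the class name mirrors the notebook; the dataclass version is illustrative only):

```python
from __future__ import annotations  # PEP 563: annotations stay as strings, unevaluated

from dataclasses import dataclass


@dataclass
class AirlineAgentContext:
    """Dataclass stand-in for the notebook's pydantic context model."""

    # PEP 604 union syntax; without the future import this line would raise
    # TypeError at class-definition time on Python < 3.10.
    passenger_name: str | None = None


ctx = AirlineAgentContext()
print(ctx.passenger_name)  # None
```

Because dataclasses do not evaluate annotations at runtime, this definition works on every Python version the package supports.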
src/openlayer/lib/integrations/litellm_tracer.py

Lines changed: 54 additions & 1 deletion
@@ -19,6 +19,9 @@
 
 logger = logging.getLogger(__name__)
 
+# Flag to prevent multiple patching
+_litellm_traced = False
+
 
 def trace_litellm() -> None:
     """Patch the litellm.completion function to trace completions.
@@ -57,11 +60,18 @@ def trace_litellm() -> None:
     ...     inference_id="custom-id-123"  # Optional Openlayer parameter
     ... )
     """
+    global _litellm_traced
+
     if not HAVE_LITELLM:
         raise ImportError(
             "LiteLLM library is not installed. Please install it with: pip install litellm"
         )
 
+    # Prevent multiple patching - this avoids duplicate traces
+    if _litellm_traced:
+        logger.debug("trace_litellm() already called - skipping to prevent duplicate traces")
+        return
+
     original_completion = litellm.completion
 
     @wraps(original_completion)
@@ -84,6 +94,8 @@ def traced_completion(*args, **kwargs):
         )
 
     litellm.completion = traced_completion
+    _litellm_traced = True
+    logger.debug("litellm.completion has been patched for Openlayer tracing")
 
 
 def handle_streaming_completion(
@@ -661,6 +673,7 @@ def extract_usage_from_chunk(chunk: Any) -> Dict[str, Optional[int]]:
 def extract_litellm_metadata(response: Any, model_name: str) -> Dict[str, Any]:
     """Extract LiteLLM-specific metadata from response."""
     metadata = {}
+    response_headers = {}
 
     try:
         # Extract hidden parameters
@@ -689,6 +702,7 @@ def extract_litellm_metadata(response: Any, model_name: str) -> Dict[str, Any]:
         if 'additional_headers' in hidden_params:
             headers = hidden_params['additional_headers']
             if headers:
+                response_headers = headers
                 metadata['response_headers'] = headers
 
         # Extract system fingerprint if available
@@ -697,9 +711,48 @@ def extract_litellm_metadata(response: Any, model_name: str) -> Dict[str, Any]:
 
         # Extract response headers if available
         if hasattr(response, '_response_headers'):
-            metadata['response_headers'] = dict(response._response_headers)
+            response_headers = dict(response._response_headers)
+            metadata['response_headers'] = response_headers
+
+        # Fallback: Extract cost from x-litellm-response-cost header if cost is missing or zero
+        if not metadata.get('cost') and response_headers:
+            cost_from_header = _extract_cost_from_headers(response_headers)
+            if cost_from_header is not None:
+                metadata['cost'] = cost_from_header
 
     except Exception as e:
         logger.debug("Error extracting LiteLLM metadata: %s", e)
 
     return metadata
+
+
+def _extract_cost_from_headers(headers: Dict[str, Any]) -> Optional[float]:
+    """Extract cost from LiteLLM response headers."""
+    try:
+        # Try to get cost from x-litellm-response-cost header
+        cost_str = headers.get('x-litellm-response-cost')
+        if cost_str is not None:
+            # Handle string values (headers are often strings)
+            if isinstance(cost_str, str):
+                cost = float(cost_str)
+            else:
+                cost = float(cost_str)
+
+            if cost > 0:
+                return cost
+
+        # Fallback to x-litellm-response-cost-original if primary is zero/missing
+        cost_original_str = headers.get('x-litellm-response-cost-original')
+        if cost_original_str is not None:
+            if isinstance(cost_original_str, str):
+                cost = float(cost_original_str)
+            else:
+                cost = float(cost_original_str)
+
+            if cost > 0:
+                return cost
+
+    except (ValueError, TypeError) as e:
+        logger.debug("Error parsing cost from headers: %s", e)
+
+    return None
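The cost-fallback logic in the diff reads LiteLLM's `x-litellm-response-cost` header, then `x-litellm-response-cost-original`, returning the first positive value. A condensed, self-contained sketch of that lookup (slightly simplified from the committed code: it also moves on to the fallback header when the primary one fails to parse, rather than aborting on the first parse error):

```python
import logging
from typing import Any, Dict, Optional

logger = logging.getLogger(__name__)

# Header names checked by the diff, in priority order.
_COST_HEADERS = ("x-litellm-response-cost", "x-litellm-response-cost-original")


def extract_cost_from_headers(headers: Dict[str, Any]) -> Optional[float]:
    """Return the first positive cost found in LiteLLM's cost headers, else None."""
    for name in _COST_HEADERS:
        raw = headers.get(name)
        if raw is None:
            continue
        try:
            cost = float(raw)  # header values are usually strings; float() handles both
        except (ValueError, TypeError) as exc:
            logger.debug("Error parsing cost from header %s: %s", name, exc)
            continue
        if cost > 0:
            return cost  # zero/negative means "try the next header"
    return None


print(extract_cost_from_headers({"x-litellm-response-cost": "0.00042"}))  # 0.00042
```

Treating a zero cost as "missing" is what makes the `-original` header useful: a proxy may report a discounted cost of 0 while the original provider cost is still recoverable.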

0 commit comments
