Commit d6cea5c

Merge branch 'master' into fix_hypertuner_managed_spot_350
2 parents 0459ad5 + 387213b

File tree

98 files changed: +5727 −2121 lines changed

.gitignore

Lines changed: 1 addition & 0 deletions
```diff
@@ -40,3 +40,4 @@ env/
 sagemaker_train/src/**/container_drivers/sm_train.sh
 sagemaker_train/src/**/container_drivers/sourcecode.json
 sagemaker_train/src/**/container_drivers/distributed.json
+.kiro
```

CHANGELOG.md

Lines changed: 38 additions & 0 deletions
```diff
@@ -1,4 +1,42 @@
 # Changelog
+## v3.7.1 (2026-03-31)
+
+### Features
+- **Telemetry**: Added telemetry emitter to `ScriptProcessor` and `FrameworkProcessor`, enabling SDK usage tracking for processing jobs via the telemetry attribution module (new `PROCESSING` feature enum added to telemetry constants)
+
+### Fixes
+- **ModelBuilder**: Fixed `accept_eula` handling in ModelBuilder's LoRA deployment path — previously hardcoded to `True`, now respects the user-provided value and raises a `ValueError` if not explicitly set to `True`
+- **Evaluate**: Fixed Lambda handler name derivation in the Evaluator — hardcoded the handler to `lambda_function.lambda_handler` instead of deriving it from the source filename, which caused invocation failures when the source file had a non-default name
+
+## v3.7.0 (2026-03-25)
+
+### Fixes
+- **ModelBuilder**: Sync Nova hosting configs with AGISageMakerInference (#5664)
+- **Evaluate**: Remove GPT OSS model evaluation restriction (#5658)
+
+### Features
+- **AWS Batch**: Add support for Quota Management job submission and job priority update (#5659)
+- **AWS Batch**: Extend list_jobs_by_share for quota_share_name (#5669)
+- **Evaluate**: Support IAM role for BaseEvaluator (#5671)
+- **Telemetry**: Add telemetry attribution module for SDK usage provenance (#5661)
+- **MLflow**: Metrics visualization, enhanced wait UI, and eval job links (#5662)
+
+### Chores
+- Updated SDK to use latest LMIv22 image for v3.x (#5640)
+- Migration guide update (#5655)
+- AWS Batch integ test resources are now uniquely named by test run (#5666)
+
+## v3.6.0 (2026-03-19)
+
+### Fixes
+- **HyperparameterTuner**: Include sm_drivers channel in HyperparameterTuner jobs (#5516)
+- **Pipeline**: Fix handling of training step dependencies to allow successful pipeline creation
+- **ModelBuilder**: Fix the bug in deploy from LORA finetuning job
+
+### Features
+- **Feature Processor**: Port feature processor to v3
+- **Jumpstart**: Add EUSC region config for JumpStart
+
 ## v3.5.0 (2026-03-02)
 
 ### Features
```
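The `accept_eula` fix described in the v3.7.1 entry can be sketched as follows. This is a minimal illustration with a hypothetical function name, not the SDK's actual code: the old path hardcoded `accept_eula=True`, while the fixed path respects the caller's value and fails fast when the EULA is not explicitly accepted.

```python
def validate_accept_eula(accept_eula):
    """Hypothetical sketch of the fixed accept_eula handling.

    Raise instead of silently accepting the EULA on the user's behalf.
    """
    if accept_eula is not True:
        raise ValueError(
            "accept_eula must be explicitly set to True to deploy a model "
            "that requires an end-user license agreement."
        )
    return accept_eula
```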

VERSION

Lines changed: 1 addition & 1 deletion
```diff
@@ -1 +1 @@
-3.5.0
+3.7.1
```

migration.md

Lines changed: 20 additions & 24 deletions
````diff
@@ -25,7 +25,7 @@
 
 ## Migration Tool (MCP Server)
 
-An AI-powered migration tool is available as an MCP server that can analyze your V2 code, transform it to V3, validate the results, and answer migration questions interactively through your IDE.
+This AI-powered migration tool is available as an MCP server to analyze your V2 code, transform it to V3, validate the results, and answer migration questions interactively through your IDE.
 
 ### Installation
 
@@ -61,32 +61,22 @@ Add the following to your MCP configuration file:
 {
   "mcpServers": {
     "sagemaker-sdk-helper": {
-      "command": "sagemaker-sdk-helper",
-      "args": ["--log-level", "INFO"]
+      "command": "/path/to/installation",
+      "args": ["--log-level", "INFO"],
+      "autoApprove": [
+        "ask_question",
+        "transform_code",
+        "validate_code",
+        "analyze_code"
+      ]
+    }
   }
-  }
 }
+
 ```
 
 > **Note**: If you installed in a virtual environment, use the full path to the executable (find it with `which sagemaker-sdk-helper`).
 
-**With SDK source artifacts (recommended, 20-30% better accuracy):**
-
-```json
-{
-  "mcpServers": {
-    "sagemaker-sdk-helper": {
-      "command": "/path/to/.venv/bin/sagemaker-sdk-helper",
-      "args": [
-        "--log-level", "INFO",
-        "--v2-artifacts", "/path/to/sdk_v2/sagemaker-python-sdk",
-        "--v3-artifacts", "/path/to/sdk_v3/sagemaker-python-sdk"
-      ]
-    }
-  }
-}
-```
-
 After updating the config, restart your IDE or reconnect MCP servers (in Kiro: Command Palette → "MCP: Reconnect Servers").
 
 ### Kiro CLI
@@ -97,10 +87,16 @@ For Kiro CLI, add the same configuration to `~/.kiro/settings/mcp.json`:
 {
   "mcpServers": {
     "sagemaker-sdk-helper": {
-      "command": "sagemaker-sdk-helper",
-      "args": ["--log-level", "INFO"]
+      "command": "/path/to/installation",
+      "args": ["--log-level", "INFO"],
+      "autoApprove": [
+        "ask_question",
+        "transform_code",
+        "validate_code",
+        "analyze_code"
+      ]
+    }
   }
-  }
 }
 ```
 
````

pyproject.toml

Lines changed: 4 additions & 4 deletions
```diff
@@ -33,10 +33,10 @@ classifiers = [
     "Programming Language :: Python :: 3.12",
 ]
 dependencies = [
-    "sagemaker-core>=2.5.0,<3.0.0",
-    "sagemaker-train>=1.5.0,<2.0.0",
-    "sagemaker-serve>=1.5.0,<2.0.0",
-    "sagemaker-mlops>=1.5.0,<2.0.0",
+    "sagemaker-core>=2.7.1,<3.0.0",
+    "sagemaker-train>=1.7.1,<2.0.0",
+    "sagemaker-serve>=1.7.1,<2.0.0",
+    "sagemaker-mlops>=1.7.1,<2.0.0",
 ]
 
 [project.optional-dependencies]
```

sagemaker-core/CHANGELOG.md

Lines changed: 26 additions & 0 deletions
```diff
@@ -1,4 +1,30 @@
 # Changelog
+## v2.7.1 (2026-03-31)
+
+### Features
+
+- **Telemetry**: Added telemetry emitter to `ScriptProcessor` and `FrameworkProcessor`, enabling SDK usage tracking for processing jobs via the telemetry attribution module (new `PROCESSING` feature enum added to telemetry constants)
+
+### Bug Fixes
+
+- **ModelBuilder**: Fixed `accept_eula` handling in ModelBuilder's LoRA deployment path — previously hardcoded to `True`, now respects the user-provided value and raises a `ValueError` if not explicitly set to `True`
+- **Evaluate**: Fixed Lambda handler name derivation in the Evaluator — hardcoded the handler to `lambda_function.lambda_handler` instead of deriving it from the source filename, which caused invocation failures when the source file had a non-default name
+
+## v2.7.0 (2026-03-25)
+
+### Bug fixes and Other Changes
+
+- **Enhancement**: Add telemetry attribution module for SDK usage provenance (#5661)
+- **Enhancement**: Updated SDK to use latest LMIv22 image for v3.x (#5640)
+- **Enhancement**: Resources codegen update for eval job links (#5662)
+
+## v2.6.0 (2026-03-19)
+
+### Bug fixes and Other Changes
+
+- **Fix**: resolve PermissionError during local mode cleanup of root-owned Docker files (#5629)
+- **Enhancement**: Add EUSC region config for JumpStart
+
 ## v2.5.1 (2026-03-12)
 
 ### Bug Fixes
```
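The Evaluator's Lambda handler fix in the v2.7.1 entry can be illustrated with a small sketch. The function name below is hypothetical and reproduces the buggy derivation for contrast with the fixed constant:

```python
# Old (buggy) behavior: derive the handler name from the source filename.
def handler_from_filename(source_file: str) -> str:
    module = source_file.rsplit("/", 1)[-1].removesuffix(".py")
    return f"{module}.lambda_handler"

# New behavior: a constant handler name matching how the code is packaged,
# so invocation works regardless of the original source filename.
FIXED_HANDLER = "lambda_function.lambda_handler"
```

A file named `my_eval.py` used to yield `my_eval.lambda_handler`, which fails at invocation time when the packaged entry point is `lambda_function.py`.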

sagemaker-core/VERSION

Lines changed: 1 addition & 1 deletion
```diff
@@ -1 +1 @@
-2.5.1
+2.7.1
```

sagemaker-core/src/sagemaker/core/__init__.py

Lines changed: 3 additions & 0 deletions
```diff
@@ -15,5 +15,8 @@
 # Partner App
 from sagemaker.core.partner_app.auth_provider import PartnerAppAuthProvider  # noqa: F401
 
+# Attribution
+from sagemaker.core.telemetry.attribution import Attribution, set_attribution  # noqa: F401
+
 # Note: HyperparameterTuner and WarmStartTypes are in sagemaker.train.tuner
 # They are not re-exported from core to avoid circular dependencies
```

sagemaker-core/src/sagemaker/core/image_uri_config/djl-lmi.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -46,7 +46,7 @@
         "mx-central-1": "637423239942"
       },
       "repository": "djl-inference",
-      "tag_prefix": "0.36.0-lmi21.0.0-cu129"
+      "tag_prefix": "0.36.0-lmi22.0.0-cu129"
     },
     "0.35.0": {
       "registries": {
```
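The `tag_prefix` bump above changes the container tag that image URI resolution produces. A simplified sketch of how such a config entry maps to a standard ECR image URI follows; `build_image_uri` is a hypothetical helper, not the SDK's actual retrieval code:

```python
def build_image_uri(registries: dict, region: str,
                    repository: str, tag_prefix: str) -> str:
    """Compose an ECR image URI from an image_uri_config-style entry (sketch)."""
    account = registries[region]  # account ID registered for this region
    return f"{account}.dkr.ecr.{region}.amazonaws.com/{repository}:{tag_prefix}"

uri = build_image_uri(
    {"mx-central-1": "637423239942"},
    "mx-central-1",
    "djl-inference",
    "0.36.0-lmi22.0.0-cu129",
)
# → 637423239942.dkr.ecr.mx-central-1.amazonaws.com/djl-inference:0.36.0-lmi22.0.0-cu129
```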

sagemaker-core/src/sagemaker/core/processing.py

Lines changed: 4 additions & 0 deletions
```diff
@@ -77,6 +77,8 @@
 from sagemaker.core.workflow.execution_variables import ExecutionVariables
 from sagemaker.core.workflow.functions import Join
 from sagemaker.core.workflow.pipeline_context import runnable_by_pipeline
+from sagemaker.core.telemetry.telemetry_logging import _telemetry_emitter
+from sagemaker.core.telemetry.constants import Feature
 
 from sagemaker.core._studio import _append_project_tags
 from sagemaker.core.config.config_utils import _append_sagemaker_config_tags
@@ -771,6 +773,7 @@ def __init__(
             network_config=network_config,
         )
 
+    @_telemetry_emitter(feature=Feature.PROCESSING, func_name="ScriptProcessor.run")
     @runnable_by_pipeline
     def run(
         self,
@@ -1171,6 +1174,7 @@ def _package_code(
             os.unlink(tmp.name)
         return s3_uri
 
+    @_telemetry_emitter(feature=Feature.PROCESSING, func_name="FrameworkProcessor.run")
     @runnable_by_pipeline
     def run(
         self,
```
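The `_telemetry_emitter` decorator added above wraps `run` so a usage record is emitted before the job logic executes. A minimal stand-in follows; the emitter, sink, and class below are hypothetical sketches, not the SDK's implementation:

```python
import functools

EMITTED = []  # stand-in for the real telemetry sink

def telemetry_emitter(feature: str, func_name: str):
    """Sketch of a telemetry-emitting decorator: record the feature and
    entry-point name, then call the wrapped method unchanged."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            EMITTED.append((feature, func_name))
            return func(*args, **kwargs)
        return wrapper
    return decorator

class ScriptProcessorSketch:
    @telemetry_emitter(feature="PROCESSING", func_name="ScriptProcessor.run")
    def run(self, code: str) -> str:
        # Real SDK code would submit a processing job here.
        return f"submitted {code}"
```

Because the wrapper delegates with `*args, **kwargs` and preserves metadata via `functools.wraps`, it stacks cleanly under other decorators such as `@runnable_by_pipeline`.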
