Commit d5eed80

Author: Pravali Uppugunduri
fix: Remove hardcoded secret key from Triton ONNX export path
The ONNX export path in _prepare_for_triton() set self.secret_key to a hardcoded value, 'dummy secret key for onnx backend'. This key was then passed as SAGEMAKER_SERVE_SECRET_KEY into the container environment variables and exposed in plaintext via the DescribeModel/DescribeEndpointConfig APIs.

The ONNX path does not use pickle serialization: models are exported to .onnx format and loaded natively by Triton's ONNX Runtime backend. There is no serve.pkl, no metadata.json, and no integrity check to perform. The secret key was dead code that also constituted a hardcoded credential (CWE-798).

With this change, self.secret_key remains an empty string (set by _build_for_triton), and the existing cleanup in _build_for_transformers removes the empty SAGEMAKER_SERVE_SECRET_KEY from env_vars before CreateModel.

Addresses: P400136088 (Bug 2 - Hardcoded secret key)
1 parent f8df0a7 commit d5eed80
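As a minimal sketch of the cleanup step the message refers to: before CreateModel, an empty SAGEMAKER_SERVE_SECRET_KEY entry can simply be filtered out of the container environment so no plaintext placeholder is ever exposed. The helper name `scrub_env_vars` below is hypothetical and not the SDK's actual implementation in _build_for_transformers.

```python
# Hypothetical sketch of the env-var cleanup described in the commit message;
# the real logic lives in _build_for_transformers and may differ in detail.
def scrub_env_vars(env_vars: dict) -> dict:
    """Drop SAGEMAKER_SERVE_SECRET_KEY when it is empty, keep everything else."""
    return {
        key: value
        for key, value in env_vars.items()
        if not (key == "SAGEMAKER_SERVE_SECRET_KEY" and not value)
    }

env = {"SAGEMAKER_SERVE_SECRET_KEY": "", "SAGEMAKER_PROGRAM": "inference.py"}
print(scrub_env_vars(env))  # {'SAGEMAKER_PROGRAM': 'inference.py'}
```

Because DescribeModel returns the container environment verbatim, removing the key entirely (rather than leaving it empty or setting a dummy value) is what keeps the API response free of credential-shaped strings.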

File tree

1 file changed: 2 additions, 1 deletion

sagemaker-serve/src/sagemaker/serve/model_builder_utils.py

Lines changed: 2 additions & 1 deletion

@@ -3075,7 +3075,8 @@ def _prepare_for_triton(self):
         export_path.mkdir(parents=True)
 
         if self.model:
-            self.secret_key = "dummy secret key for onnx backend"
+            # ONNX path: no pickle serialization, no serve.pkl, no integrity check needed.
+            # Do not set secret_key — there is nothing to sign.
 
             if self.framework == Framework.PYTORCH:
                 self._export_pytorch_to_onnx(
