Centralize 'trtexec' subprocess runs in ONNX into a single function #1268
base: main
**modelopt/onnx/quantization/ort_utils.py**

```diff
@@ -44,6 +44,15 @@ def _check_lib_in_ld_library_path(ld_library_path, lib_pattern):
         return False, None


+def _run_trtexec(cmd, timeout=None):
+    """Run a 'trtexec' command via subprocess."""
+    # Ensure that this command is a trtexec run
+    assert any("trtexec" in c for c in cmd), "Subprocess can only execute 'trtexec' commands"
```
**Contributor (Author):** @kevalmorabia97 please let me know if this is enough to ensure only `trtexec` commands can be executed.

**Collaborator:** Can we not pass … ? Please add type annotations also.
```diff
+    # Run trtexec command
+    return subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)  # nosec B603
```

**Comment on lines +47 to +53**
**Contributor (CodeRabbit):**

Scripts executed to verify the flagged patterns:

```shell
# Verify the problematic patterns exist at the expected locations.
rg -n 'def _run_trtexec|assert any\("trtexec"|# nosec B603|subprocess\.run\(' modelopt/onnx/quantization/ort_utils.py -C2

# Check for calls to _run_trtexec to understand usage context.
rg -n '_run_trtexec' modelopt/onnx/quantization/ort_utils.py -A 5 -B 5

# Check how subprocess is imported and used elsewhere in the file.
rg -n 'import subprocess|from subprocess' modelopt/onnx/quantization/ort_utils.py -B 2 -A 2

# Inspect the file structure and full context around the function.
sed -n '1,60p' modelopt/onnx/quantization/ort_utils.py
```

Replace the `assert` with explicit validation: `assert` statements are stripped when Python runs with `-O`, and the substring check passes for any command where some argument merely contains `"trtexec"`. Line 50 then uses `subprocess.run` on the unvalidated command.

Proposed fix:

```diff
-def _run_trtexec(cmd, timeout=None):
+def _run_trtexec(cmd: Sequence[str], timeout: float | None = None):
     """Run a 'trtexec' command via subprocess."""
-    # Ensure that this command is a trtexec run
-    assert any("trtexec" in c for c in cmd), "Subprocess can only execute 'trtexec' commands"
+    if not cmd:
+        raise ValueError("Empty command passed to _run_trtexec")
+    exe = os.path.basename(str(cmd[0])).lower()
+    if exe not in {"trtexec", "trtexec.exe"}:
+        raise ValueError("Subprocess can only execute 'trtexec' commands")
     # Run trtexec command
-    return subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)  # nosec B603
+    return subprocess.run(list(cmd), capture_output=True, text=True, timeout=timeout, check=False)
```
```diff
 def _check_for_trtexec(min_version: str = "10.0") -> str:
     """Check if the `trtexec` CLI tool is available in PATH and is >= min_version.
```
```diff
@@ -87,7 +96,7 @@ def _parse_version_from_string(version_str: str) -> str | None:
         )

     try:
-        result = subprocess.run([trtexec_path], capture_output=True, text=True, timeout=5)  # nosec B603
+        result = _run_trtexec([trtexec_path], timeout=5)
         banner_output = result.stdout + result.stderr
         parsed_version = _parse_version_from_string(banner_output)
```
**Collaborator:** Can you unify with `Model-Optimizer/modelopt/torch/_deploy/_runtime/tensorrt/engine_builder.py` (line 62 in `07ae8e7`)?

**Contributor (Author):** I was trying to avoid coupling `modelopt/torch` with `modelopt/onnx`. What do you think is best?

**Collaborator:** `modelopt.torch._deploy` already has some dependencies on `modelopt.onnx`, so it's fine. We can keep this in `modelopt.onnx.trt_utils` or whatever name makes sense. For example this: https://github.com/NVIDIA/Model-Optimizer/blob/07ae8e71281a1c17898991b0cd76db5788f490bb/modelopt/torch/_deploy/_runtime/trt_client.py

**Contributor (Author):** Good point, will look into this today.