Commit 73755b3
fix(pt,pd): remove redundant tensor handling to eliminate tensor construction warnings (#4907)
This PR fixes deprecation warnings that occur when `torch.tensor()` or
`paddle.to_tensor()` is called on existing tensor objects:
**PyTorch warning:**
```
UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
```
**PaddlePaddle warning:**
```
UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach(), rather than paddle.to_tensor(sourceTensor).
```
## Root Cause
The warnings were triggered in multiple locations:
1. **PyTorch**: Test cases passed tensor objects directly to ASE
calculators, which internally convert them with `torch.tensor()`
2. **PaddlePaddle**: Similar issues in the `eval_model` function and the
`to_paddle_tensor` utility, plus a `TypeError` caused by calling the
`tensor.to()` method with `place=` instead of `device=`
## Solution
**For PyTorch:**
- Modified test cases to convert tensor inputs to numpy arrays before
passing to ASE calculators
- Removed redundant tensor handling in `to_torch_tensor` utility
function since the non-numpy check already handles tensors by returning
them as-is
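
The test-side conversion follows this shape (a minimal sketch; the helper name `ensure_numpy` is illustrative, the actual tests inline the conversion):

```python
import numpy as np


def ensure_numpy(x):
    """Convert a framework tensor to a numpy array before handing it
    to an ASE calculator; numpy arrays and plain sequences pass through.
    (Illustrative helper, not the actual test code.)"""
    if isinstance(x, np.ndarray):
        return x
    if hasattr(x, "detach"):   # PyTorch tensors: drop the autograd graph first
        x = x.detach()
    if hasattr(x, "numpy"):    # torch/paddle tensors expose .numpy()
        return x.numpy()
    return np.asarray(x)
```

Because the calculator now only ever sees numpy arrays, its internal `torch.tensor()` call no longer copy-constructs from a tensor, so the warning disappears.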
**For PaddlePaddle:**
- Added proper type checking in `eval_model` function to handle existing
tensors with `clone().detach()`
- Removed redundant tensor handling in `to_paddle_tensor` utility
function, applying the same optimization as PyTorch
- Fixed the `TypeError` by changing `place=` to `device=` in all `tensor.to()`
method calls (PaddlePaddle's tensor `.to()` method expects a `device=`
argument, while `paddle.to_tensor()` correctly uses `place=`)
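
The `eval_model` type check can be sketched like this (names are illustrative; `to_tensor` stands in for `paddle.to_tensor`, and the real code also handles dtype and device placement):

```python
import numpy as np


def normalize_input(x, to_tensor):
    """Sketch of the eval_model fix: inputs that are already tensors are
    cloned and detached instead of being re-wrapped by the framework
    converter, which is exactly what the deprecation warning recommends."""
    if hasattr(x, "clone") and hasattr(x, "detach"):
        return x.clone().detach()     # existing tensor: no re-wrapping
    return to_tensor(np.asarray(x))   # everything else: framework converter
```

The same dispatch works for both backends because `clone()`/`detach()` exist on both `torch.Tensor` and `paddle.Tensor`.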
## Changes Made
1. **`source/tests/pt/test_calculator.py`**: Fixed `TestCalculator` and
`TestCalculatorWithFparamAparam` to convert PyTorch tensors to numpy
arrays before passing to ASE calculator
2. **`deepmd/pt/utils/utils.py`**: Removed redundant tensor-specific
handling in `to_torch_tensor` function
3. **`source/tests/pd/common.py`**: Updated `eval_model` function with
type checking for PaddlePaddle tensors and fixed `tensor.to()` method
calls to use `device=` instead of `place=`
4. **`deepmd/pd/utils/utils.py`**: Removed redundant tensor-specific
handling in `to_paddle_tensor` function for consistency with PyTorch
Both utility functions now rely on a single early return: the `if not
isinstance(xx, np.ndarray): return xx` check passes every non-numpy
input (including existing tensors) through unchanged, eliminating the
separate tensor-specific code paths.
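
The simplified dispatch in both utilities has this shape (a sketch with a generic `convert` callback; the real `to_torch_tensor`/`to_paddle_tensor` also remap dtypes and move data to the target device):

```python
import numpy as np


def to_backend_tensor(xx, convert):
    """Shared pattern of the simplified utilities: anything that is not
    a numpy array (including an existing backend tensor, or None) is
    returned as-is, so tensors are never re-wrapped and no
    copy-construction warning can fire."""
    if not isinstance(xx, np.ndarray):
        return xx          # tensors, None, scalars: pass through unchanged
    return convert(xx)     # numpy arrays: hand off to the backend converter
```

Since the early return never inspects the backend tensor type, the utility stays importable even when only one framework is installed.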
This change is backward compatible and maintains the same functionality
while eliminating both deprecation warnings and TypeErrors, improving
code consistency between PyTorch and PaddlePaddle backends.
Fixes #3790.
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: njzjz <9496702+njzjz@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent 727ec3c · commit 73755b3
3 files changed: 112 additions, 34 deletions