Commit 2f11c0c
Arm backend: Avoid inplace buffer modification (#17901)
For some buffers with requires_grad=True, the in-place masked assignment triggered a RuntimeError: "a view of a leaf Variable that requires grad is being used in an in-place operation". Replacing it with the out-of-place torch.nan_to_num avoids this. Signed-off-by: Erik Lundell <erik.lundell@arm.com>
1 parent 427bc4e commit 2f11c0c

1 file changed

Lines changed: 2 additions & 3 deletions

backends/arm/_passes/replace_inf_and_limit_values_pass.py
```diff
@@ -52,9 +52,8 @@ def call(self, graph_module: torch.fx.GraphModule):
 
                 modified = True
                 # 255 here is mainly for attention_mask in Llama for reasonable quant scale
-                tensor[tensor == float("inf")] = 255
-                tensor[tensor == float("-inf")] = -255
-                setattr(graph_module, buf_name, tensor)
+                t = torch.nan_to_num(tensor, posinf=255, neginf=-255)
+                setattr(graph_module, buf_name, t)
 
         for node in graph_module.graph.nodes:
             arg_list = list(node.args)
```
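A minimal standalone sketch (not the repository's code) of the failure mode this commit fixes: in-place masked assignment on a leaf tensor with requires_grad=True raises a RuntimeError, while torch.nan_to_num builds a new tensor out of place and therefore succeeds. The example tensor and values are illustrative assumptions.

```python
import torch

# A leaf tensor with requires_grad=True, standing in for the affected buffer.
tensor = torch.tensor([1.0, float("inf"), float("-inf")], requires_grad=True)

# In-place masked assignment on such a leaf raises a RuntimeError
# ("... a leaf Variable that requires grad is being used in an in-place operation").
try:
    tensor[tensor == float("inf")] = 255
    raised = False
except RuntimeError:
    raised = True

# torch.nan_to_num returns a new tensor instead of mutating the input,
# replacing +inf/-inf with the given sentinel values.
t = torch.nan_to_num(tensor, posinf=255, neginf=-255)
print(raised, t.detach().tolist())
```

Because nan_to_num is out of place, the original buffer is left untouched and the new tensor can be re-registered on the module, which is exactly the shape of the fix in the diff above.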
