llama-server: fix model params not propagated #21509
Merged · +6 −3