Name and Version
version: 8734 (ddf03c6)
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
Problem description & steps to reproduce
After b8734, llama-server produces gibberish output on CUDA builds (compute capability 6.1, GTX 1080 Ti cards). If I'm not mistaken, the same version also produces gibberish without GPU offload.
I tried to bisect where the bug begins; so far I found that b8730 works fine, while b8740 gives gibberish output.
First Bad Commit
Not sure; something after version 8734 (ddf03c6).
Relevant log output
No response