Hello all,
I used to have a working CLI app from cudnn. Now the binary is missing; could this be a regression in the cudnn package? What do you say?
There is indeed no executable in the repo package. This package is taken directly from Arch, so if you are sure that anything other than the .so modules and sources should be there and was forgotten, write to the packager.
Or try one of the many AUR versions of that package.
There is no executable in the source tarball either; it contains only lib and include folders plus a license.
Please don’t bother them. This is not a packaging issue.
My bad: the binary I was looking for is still installed, as nvidia-smi.
Unfortunately, my ollama install doesn’t run on the CUDA GPU anymore:
time=2025-11-21T14:57:08.520+01:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="62.7 GiB" available="50.0 GiB"
time=2025-11-21T14:57:08.520+01:00 level=INFO source=routes.go:1638 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
I am clueless…
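The log above shows ollama discovering only a CPU backend ("total vram"="0 B"), which usually means it could not load the NVIDIA driver or CUDA runtime libraries at startup. A minimal first diagnostic, assuming a standard Arch-style install (paths and service name are assumptions, not confirmed by this thread), could be:

```shell
# Confirm the kernel driver is loaded and the GPU is visible to it.
nvidia-smi

# Confirm the CUDA driver/runtime libraries are visible to the dynamic linker.
ldconfig -p | grep -E 'libcuda|libcudart'

# Look for GPU-discovery messages in the ollama service logs
# (assumes ollama runs as a systemd service named "ollama").
journalctl -u ollama --no-pager | grep -i -E 'cuda|gpu' | tail -n 20
```

If `nvidia-smi` works but ollama still reports only a CPU device, the mismatch is typically between the driver version and the CUDA libraries ollama was built against, which would show up in the grep'd log lines.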