ROCm library not detected
Description:
ollama does not detect the ROCm library, so inference falls back to the CPU instead of the GPU.
Additional info:
- package version(s): 0.1.29-1
- link to upstream bug report, if any: https://github.com/ollama/ollama/issues/2411
Steps to reproduce:
- Install and enable ollama through systemd.
- Run: ollama run mistral
- Logs show that the ROCm library is not found; this is the relevant line:
level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu_avx2 cpu_avx cpu]"
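To check the log line above quickly, a small sketch (the bracketed list format is taken verbatim from the logs in this report; the helper name has_rocm is my own):

```shell
# has_rocm: succeed if a log line lists "rocm" among the dynamic LLM
# libraries reported by payload_common.go
has_rocm() {
  printf '%s\n' "$1" | grep -q 'Dynamic LLM libraries \[[^]]*rocm'
}

cpu_only='level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu_avx2 cpu_avx cpu]"'
with_gpu='level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 rocm]"'

if has_rocm "$cpu_only"; then echo "packaged build: rocm found"; else echo "packaged build: rocm missing"; fi
if has_rocm "$with_gpu"; then echo "source build: rocm found"; else echo "source build: rocm missing"; fi
```

On an affected system the same line can be pulled from the service log with journalctl -u ollama | grep "Dynamic LLM libraries" (assuming the unit is named ollama).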
I built ollama from the v0.1.29 tag; that build correctly detects the ROCm library and GPU inference works. The log in that case is:
level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 rocm]"
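For reference, the working source build roughly followed the upstream developer instructions for that tag (a sketch, not a definitive recipe; it assumes go, cmake, gcc and the ROCm toolchain are installed so the rocm runner gets generated):

```shell
# Fetch the exact tag that the package claims to ship
git clone https://github.com/ollama/ollama.git
cd ollama
git checkout v0.1.29

# go generate builds the dynamic LLM runners (including rocm when the
# ROCm toolchain is found); go build produces the ollama binary
go generate ./...
go build .

# Serve from the local build and compare its "Dynamic LLM libraries" log line
./ollama serve
```

This suggests the regression is in how the package is built or packaged rather than in the upstream source at that tag.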