
ROCm library not detected

Description:

ollama does not detect the ROCm library, so inference falls back to the CPU instead of running on the GPU.

Additional info:

Steps to reproduce:

  1. Install ollama and enable its service through systemd.
  2. Run: ollama run mistral
  3. The logs show that the ROCm library is not found; this is the relevant line:
level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu_avx2 cpu_avx cpu]"
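The presence or absence of "rocm" in that list is the quickest indicator of which runners the binary shipped with. A minimal sketch of that check, run here against the log line quoted above so it is self-contained; on a live system you would instead pipe in something like journalctl -u ollama (assuming the service unit is named ollama):

```shell
# Log line copied from this report; on a running system, replace with e.g.:
#   journalctl -u ollama | grep "Dynamic LLM libraries"
line='level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu_avx2 cpu_avx cpu]"'

# If "rocm" is missing from the library list, GPU inference cannot be selected.
if echo "$line" | grep -q "rocm"; then
  echo "rocm runner present - GPU inference possible"
else
  echo "rocm runner missing - ollama will fall back to CPU"
fi
# prints: rocm runner missing - ollama will fall back to CPU
```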

I built ollama from the v0.1.29 tag; that build correctly detects the ROCm library, and GPU inference works. The corresponding log line in that case is:

level=INFO source=payload_common.go:139 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 rocm]"