Runtime dependency on `hipblas`
Description:
ollama-rocm seems to have a runtime dependency on hipblas, since without that package installed I get the following journal output:
May 02 19:22:33 arpc ollama[7121]: time=2024-05-02T19:22:33.295+02:00 level=INFO source=server.go:127 msg="offload to gpu" reallayers=0 layers=0 required="4576.0 MiB" used="677.5 MiB" available="0 B" kv="256.0 MiB" fulloffload="164.0 MiB" partialoffload="677.5 >
May 02 19:22:33 arpc ollama[7121]: time=2024-05-02T19:22:33.296+02:00 level=INFO source=server.go:264 msg="starting llama server" cmd="/tmp/ollama265547122/runners/cpu/ollama_llama_server --model /var/lib/ollama/.ollama/models/blobs/sha256-00e1317cbf74d901080d7>
May 02 19:22:33 arpc ollama[7121]: time=2024-05-02T19:22:33.296+02:00 level=INFO source=server.go:389 msg="waiting for llama runner to start responding"
May 02 19:22:33 arpc ollama[7121]: /tmp/ollama265547122/runners/cpu/ollama_llama_server: error while loading shared libraries: libhipblas.so.2: cannot open shared object file: No such file or directory
May 02 19:22:33 arpc ollama[7121]: time=2024-05-02T19:22:33.346+02:00 level=ERROR source=routes.go:120 msg="error loading llama server" error="llama runner process no longer running: 127 "
May 02 19:22:33 arpc ollama[7121]: [GIN] 2024/05/02 - 19:22:33 | 500 | 751.069643ms | 127.0.0.1 | POST "/api/generate"
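For reference, the missing library can be confirmed from the loader's point of view; this is a sketch, assuming the runner path from the log above (the `/tmp/ollama265547122` directory is per-run and will differ on other machines):

```shell
# Listing the runner's shared-library dependencies shows which ones the
# dynamic linker cannot resolve; an unresolvable entry is printed as
# "libhipblas.so.2 => not found":
#
#   ldd /tmp/ollama265547122/runners/cpu/ollama_llama_server | grep hipblas
#
# A system-wide check against the dynamic linker cache works without the
# runner binary; prints the fallback message when hipblas is not installed:
ldconfig -p | grep libhipblas || echo "libhipblas not found in linker cache"
```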
Additional info:
- package version(s): 0.1.32-1
Steps to reproduce:
- Install ollama-rocm
- Make sure hipblas is not installed
- systemctl start ollama
- ollama run <some model> <some prompt>