Ollama fails to run on ROCm 6.2.2
Description:
The build of ollama-rocm in testing fails to run on an RX 7700 XT card, unlike upstream's binaries: it either fails with a CUDA error or, depending on the run, with a more verbose CUDA error.
Additional info:
- package version(s): ollama-rocm 0.3.12-5
- config and/or log files: ollama.error.txt
- link to upstream bug report, if any: https://github.com/ollama/ollama/issues/7564
Steps to reproduce:
- Install ollama-rocm and dependencies from testing
- Start ollama.service
- Run: ollama run llama2
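The steps above can be sketched as a shell session (a minimal sketch, assuming an Arch Linux system with the testing repositories enabled; the package, service, and model names are taken from this report):

```shell
# Install ollama-rocm and its dependencies from the testing repos
sudo pacman -S ollama-rocm

# Start the service
sudo systemctl start ollama.service

# Trigger the failure (errors out on an RX 7700 XT with this build)
ollama run llama2
```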
Edited by Christopher Snowhill