
Ollama fails to run on ROCm 6.2.2

Description:

The build of ollama-rocm in testing fails to run on an RX 7700 XT card, whereas upstream's prebuilt binaries work on the same hardware. Running a model either aborts with a CUDA error or with a more verbose variant of the same CUDA error.

Additional info:

Steps to reproduce:

  1. Install ollama-rocm and dependencies from testing
  2. Start ollama.service
  3. Run ollama run llama2
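
The steps above can be sketched as a shell session. This is a minimal sketch assuming Arch Linux with the testing repositories already enabled in /etc/pacman.conf; the package, service, and model names are taken from the report:

```shell
# 1. Install ollama-rocm and its dependencies from the testing repositories
sudo pacman -S ollama-rocm

# 2. Start the Ollama service
sudo systemctl start ollama.service

# 3. Trigger the failure by running a model
ollama run llama2
```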
Edited by Christopher Snowhill