>>> looks like i will need to build ollama so it will support my amd gfx1036...: Welcome to the ollama-for-amd wiki!

This wiki aims to extend support for AMD GPUs that official Ollama doesn't currently cover due to limitations in official ROCm on Windows. The goal is to remove these GPU limitations and include support for more AMD graphics card models. Remember, this help is provided voluntarily by the community.

Important

"This repository is essentially a fork of Ollama that primarily modifies the GPU lists so it works with more AMD GPUs. It doesn't introduce any new features beyond that."

(For those who prefer another language, right-click and use the auto-translate feature in your browser or device. Most platforms have this built in.)

Need Help?

Issues? Please share your server logs in the issues section, and also check out the official Ollama issue tracker (ollama/issues) for more in-depth assistance or to find help from a similar issue. If you suspect the issue originates from the Ollama program itself, please hold off on further troubleshooting for now; the Ollama team is working on updates to address such problems. Keep an eye out for a new release! A quick way to grab the server log on Windows is shown below.
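When filing an issue, the Windows server log is the most useful artifact. A minimal sketch for locating and tailing it, assuming the default install writes logs to `%LOCALAPPDATA%\Ollama\server.log` (adjust the path if your setup differs):

```powershell
# Show the last 100 lines of the Ollama server log (default Windows location).
$log = Join-Path $env:LOCALAPPDATA "Ollama\server.log"
Get-Content $log -Tail 100

# Optionally copy it somewhere easy to attach to a GitHub issue.
Copy-Item $log "$env:USERPROFILE\Desktop\ollama-server.log"
```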

Tip

Compilation Not Required? Try the Demo!

If you're not comfortable compiling or building the software yourself, no problem! You can skip the development steps and jump straight to using the **Step 7, Demo Release Version**.

To use the demo, simply replace the rocblas files described in the ROCm Support section with those provided in the demo release; a sketch of that file swap follows.
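As a concrete illustration, here is a minimal sketch of that swap in PowerShell. The destination assumes the default per-user Ollama install; the source path `C:\Temp\rocmlibs` is a hypothetical folder where you unpacked the rocblas package matching your architecture:

```powershell
# Hypothetical folder holding the rocblas.dll and library\ folder for your GPU (e.g. gfx1036).
$src = "C:\Temp\rocmlibs"
# Default per-user Ollama install; adjust if you installed elsewhere.
$dst = "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama\rocm"

# Back up the original DLL, then overwrite with the architecture-specific one.
Copy-Item "$dst\rocblas.dll" "$dst\rocblas.dll.bak" -ErrorAction SilentlyContinue
Copy-Item "$src\rocblas.dll" $dst -Force

# Replace the tuned-kernel library folder as well (it must match rocblas.dll).
Remove-Item "$dst\rocblas\library" -Recurse -Force -ErrorAction SilentlyContinue
Copy-Item "$src\library" "$dst\rocblas\library" -Recurse
```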

**Development Guide**

("This guide outlines how the repository builds the program. You don't need to follow these steps unless you're interested in customizing the build process. One benefit of following them is that the compiled program may take up less space, since it is optimized only for your specific GPU architecture instead of a wide range of architectures.")

1. Prepare the Development Environment
Follow the guide in development.md for Windows development setup.

Downloading & Editing
2. Prepare Scripts
Clone the Ollama repository:
git clone https://github.com/ollama/ollama.git
or simply clone this repo.

Edit the CMakePresets.json file:
Locate the file: CMakePresets.json

Add your GPU model to the existing list at line 59. Here's an example of the officially supported list:
"AMDGPU_TARGETS": "gfx900;gfx940;gfx941;gfx942;gfx1010;gfx1012;gfx1030;gfx1100;gfx1101;gfx1102;gfx906:xnack-;gfx908:xnack-;gfx90a:xnack+;gfx90a:xnack-"

In the examples, AMDGPU_TARGETS lists all the architectures. To speed up the build, you can restrict it to only the architectures you need, e.g. -DAMDGPU_TARGETS="gfx1031".

Add your desired GPU models to this list. Example additional GPUs for this repo (see AMD-GPU-Arches-lists-Info, or get the info from techpowerup.com):
"gfx803" (rocm5), "gfx900:xnack-" (rocm5), "gfx902" (rocm5), "gfx90c:xnack-" (rocm5, only available in a self-build), "gfx906:xnack-", "gfx1010:xnack-", "gfx1011", "gfx1012:xnack-", "gfx1031", "gfx1032", "gfx1034", "gfx1035", "gfx1036", "gfx1103", "gfx1150"
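If you want to double-check which targets your checkout currently builds before editing, a quick way (assuming you are in the repository root) is:

```powershell
# Print every line in CMakePresets.json that mentions AMDGPU_TARGETS,
# so you can confirm where to add your architecture (e.g. gfx1036).
Select-String -Path .\CMakePresets.json -Pattern "AMDGPU_TARGETS"
```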

You also need to edit CMakeLists.txt at line 99.

Official support list:

list(FILTER AMDGPU_TARGETS INCLUDE REGEX "^gfx(900|94[012]|101[02]|1030|110[012])$")
Meaning: gfx900, gfx940, gfx941, gfx942, gfx1010, gfx1012, gfx1030, gfx1100, gfx1101, gfx1102

Updated list in this repo:

list(FILTER AMDGPU_TARGETS INCLUDE REGEX "^gfx(803|900(:xnack-)|902|90c(:xnack-)|1010(:xnack-)|1011|1012(:xnack-)|103[0-6]|110[0-3]|1150)$")
Meaning: gfx803, gfx900:xnack-, gfx902, gfx90c:xnack-, gfx1010:xnack-, gfx1011, gfx1012:xnack-, gfx1030 through gfx1036, gfx1100 through gfx1103, gfx1150
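As a quick sanity check that your architecture survives the filter, you can test your gfx name against the pattern in PowerShell; for this particular pattern the regex behaves the same as CMake's:

```powershell
# Should print True if gfx1036 will pass the CMakeLists.txt filter.
"gfx1036" -match "^gfx(803|900(:xnack-)|902|90c(:xnack-)|1010(:xnack-)|1011|1012(:xnack-)|103[0-6]|110[0-3]|1150)$"
```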

ROCm Support

3. Prepare ROCm
Ensure you have a compatible HIP SDK installed. Install HIP SDK 6.1.2 or HIP SDK 6.2.4 from hip-sdk.
Officially supported AMD ROCm GPUs: you can find the complete list of officially supported GPUs at rocm.docs.amd.com.

For supported GPUs, please use the official ROCm version from AMD together with official Ollama. You don't need to build anything unless you want to learn how. (Examples: gfx900, gfx940, gfx941, gfx942, gfx1010, gfx1012, gfx1030, gfx1100, gfx1101, gfx1102)

If your GPU is not on the list:

Try pre-built ROCm libraries:
Some pre-built rocblas libraries are available for specific GPUs at:
ROCmLibs for HIP SDK 6.1.2

ROCmLibs for HIP SDK 6.2.4

Note

**Using a ROCm 5 build? Refer to** build-ollama-with-ROCm5

Choose the appropriate library based on your GPU model (e.g., gfx902, gfx1031).

Building ROCm libraries:

You may need to build rocblas yourself if a pre-built version is not available for your GPU.
Refer to the ROCm documentation for detailed build instructions: learn how to build rocblas (a starting point is sketched below).
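If you do need to build rocblas yourself, the usual starting point is the rocBLAS source tree. The exact build command and flags depend on your HIP SDK version and target architecture, so treat this only as a rough sketch of where to begin and follow the rocBLAS README for the actual build steps:

```powershell
# Fetch the rocBLAS sources; Windows/HIP SDK build instructions live in the repo's README and docs.
git clone https://github.com/ROCm/rocBLAS.git
cd rocBLAS
# Check out the branch matching your installed HIP SDK (assumption: branch names
# follow the ROCm release scheme, e.g. release/rocm-rel-6.1).
git branch -r
```
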
Installing and Replacing Libraries:

Install the HIP SDK: this creates a C:\Program Files\AMD\ROCm\6.1\bin folder.
Place rocblas.dll in the C:\Program Files\AMD\ROCm\6.1\bin folder, replacing any existing rocblas.dll.
Replace the library folder within rocblas\library: this ensures your code uses the correct library files.
Add the bin folder to PATH: e.g. add C:\Program Files\AMD\ROCm\6.1\bin\ to the system environment variable PATH (most people forget this step; see the sketch below).
(Change the path to 6.2 if you want to work with HIP SDK 6.2.)
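A minimal sketch of the PATH step in PowerShell, assuming the default HIP SDK 6.1 install location; run it from an elevated prompt since it edits the machine-wide PATH:

```powershell
# Append the ROCm bin directory to the machine-wide PATH (requires an elevated prompt).
$rocmBin = "C:\Program Files\AMD\ROCm\6.1\bin"
$machinePath = [Environment]::GetEnvironmentVariable("Path", "Machine")
if ($machinePath -notlike "*$rocmBin*") {
    [Environment]::SetEnvironmentVariable("Path", "$machinePath;$rocmBin", "Machine")
}

# Verify the replaced files are in place; open a NEW shell for the PATH change to take effect.
Test-Path "$rocmBin\rocblas.dll"
Test-Path "$rocmBin\rocblas\library"
```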

Community Support:

If you encounter issues or need assistance with rocblas and the library files, please leave a message on the project's issues page for community support.

4. Building ollama.exe:

cmake -B build -G Ninja -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++

Then run

cmake --build build --config Release

or

ninja -C build
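Note that on recent Ollama versions the CMake step builds the GPU backend libraries, while the `ollama.exe` binary itself is produced by the Go toolchain. If your checkout works that way (check development.md for your version), the final step looks roughly like this:

```powershell
# Build the Go binary after the CMake/Ninja step has produced the GPU libraries.
# Requires a Go toolchain matching the version pinned in go.mod.
go build -o ollama.exe .
```
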
6. Building the Installer:
Install Inno Setup: to build an installer for Ollama, you'll need Inno Setup: https://jrsoftware.org/isinfo.php.
Run the PowerShell script: in the top directory of the Ollama repository, run:
powershell -ExecutionPolicy Bypass -File .\scripts\build_windows.ps1
The installer: after the build is complete, you'll find the OllamaSetup.exe installer in the dist folder. It will work exactly like the official release.
If the installer build is broken in a recent update: the OllamaSetup.exe installer in the dist folder may not have packaged all the built libraries in build\lib\ollama and the rocm libs. If that step is necessary for you, simply copy them manually into the Ollama program folder after installing OllamaSetup.exe (i.e. into C:\Users\UsrName\AppData\Local\Programs\Ollama\lib\ollama\.. and rocm, as shown in the demo release ollama-windows-amd64.7z on v0.5.9 or official Ollama); a sketch of that copy follows.
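A hedged sketch of that manual copy, assuming the build outputs ended up under `build\lib\ollama` in your source tree and the default per-user install path (adjust both paths to your setup):

```powershell
# Copy the freshly built backend libraries over the installed ones.
$buildLibs = ".\build\lib\ollama"                            # from your source tree
$installed = "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama"  # default install path

# /E copies subdirectories (including the rocm folder); existing files are overwritten.
robocopy $buildLibs $installed /E
```
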
7. Demo Release Version:
Test-drive Ollama: if you want to test before building from source, download and install a demo release from ollama-for-amd/releases.
Install the base Ollama setup: start by running OllamaSetup.exe. This sets up the basic framework for Ollama. (Optional: if you skip this step, make sure to remove any existing Ollama client to avoid version conflicts. Remove the Ollama program directory by deleting it rather than uninstalling, as the uninstaller will also remove your downloaded models.)

(Optional) Unzip and replace libraries (this step is no longer necessary if you did the first step, unless you do not have a rocm directory inside your Ollama directory): download ollama-windows-amd64.7z and unzip it. Then replace the files within C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama with those from the extracted archive.

Match ROCm to your GPU: download the appropriate ROCm libraries for your GPU architecture from ROCmlibs for 6.1.2 or ROCmLibs for HIP SDK 5.7 (depending on the release package).

Important: ensure the rocblas.dll and rocblas/library folder you choose match your GPU architecture.
Replace the existing files in your Ollama program folder (i.e. C:\Users\UsrName\AppData\Local\Programs\Ollama\lib\ollama\rocm) with these downloaded libraries.
Start serving: once everything is installed, run ollama run llama3.1 in CMD or PowerShell to start your model. Alternatively, skip step 1, navigate to your ollama-windows-amd64 program directory, replace the files with your target rocm libs, and run ./ollama serve.

This repository serves as an example and won't be regularly updated.

Try pre-built ROCm libraries:

Some pre-built rocblas libraries are available for specific GPUs at:
ROCmLibs for HIP SDK 6.1.2

ROCmLibs for HIP SDK 5.7

ROCmLibs for HIP SDK 6.2.4

Choose the appropriate library based on your GPU model (e.g., gfx902, gfx1032).
HIP SDK 6.2.4 notes: the Ollama build made with HIP 6.1 should also work with the rocm libs for 6.2.4.

Multi-GPU notes: if you have multiple GPUs and their architectures are not on the support lists, please consider building your rocm libs and Ollama as described in the development guide.

Note

Why are the install steps so complex? The installation process might seem complex because the ROCm libraries are quite large. The Ollama package for all supported architectures can be over 2 GB zipped, and even larger unzipped, most of which you don't actually need. We aim to simplify this by breaking it into two manageable steps.

For an even easier solution, consider using ByronLeeeee's Ollama-For-AMD-Installer. It streamlines updating and replacing the libraries with a single click.

Note

gfx1103 (AMD 780M) is natively supported by this repository, so no file replacements are needed. For v0.5.4 or earlier, refer to previous-demo-release-version.

gfx803 HIP memory errors are fixed by the ROCm 5.7 libs rocm.gfx803.optic.test.version.7z for ROCm 5.7. Feedback indicates that the gfx900-era architectures (gfx803, gfx900:xnack-, gfx90c:xnack-, ...) cannot be detected with HIP SDK 6.1.2 and later. Please consider building Ollama with ROCm 5.7 yourself if there is no ollama-windows-amd64-rocm5.7.7z package available on the release page. The ROCm libs for 5.7 are available at HIP SDK 5.7.

Note

The HSA_OVERRIDE_GFX_VERSION approach no longer works. Optionally, force-enable it in a self-build for gfx90c:xnack-, gfx1010:xnack-, gfx1012:xnack- and similar architectures:

Start by trying the pre-built rocblas and library files for your GPU. If you encounter issues, try setting the environment variable HSA_OVERRIDE_GFX_VERSION=9.0.12 (for gfx90c) or HSA_OVERRIDE_GFX_VERSION=10.1.2 (for gfx1012); a sketch is shown below. Refer to the Windows documentation for guidance on setting environment variables: https://www.computerhope.com/issues/ch000549.htm.
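A minimal sketch of setting the override from PowerShell, using the example values above (pick the one matching your architecture, and open a new shell afterwards so a persisted change is picked up):

```powershell
# Persist the override for the current user (example value for gfx90c).
setx HSA_OVERRIDE_GFX_VERSION 9.0.12

# Or set it only for the current session before starting the server
# (run from your Ollama program directory).
$env:HSA_OVERRIDE_GFX_VERSION = "10.1.2"   # example value for gfx1012
.\ollama serve
```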

Running your models: after setting up ROCm and Ollama, run your models in a separate shell using the command ./ollama run [model name], replacing "[model name]" with the actual name of your model.

Start the server in C:\Users\usrname\AppData\Local\Programs\Ollama\:

./ollama serve

Finally, in a separate shell, run a model (e.g.):

./ollama run llama3

Update carefully: if you are using a demo release, DO NOT click the "Update" button if Ollama prompts you. Download updates manually from this release page: ollama-for-amd/releases

Or update via the Ollama-For-AMD-Installer created by ByronLeeeee; a single click will do the update and replace the libs.

Troubleshooting on Windows:

error 0xc0000005

Make sure you have the latest drivers and that the replaced files match your GPU.
Some proven solutions, as shown in issues 8#, 14#, and 18#, are to manually move all the .dll files in Ollama\lib\ollama and the rocblas folder into Ollama\lib\ollama\runners\rocm_v6.1.

No GPU detected

level=INFO source=gpu.go:386 msg="no compatible GPUs were discovered"

Install the HIP SDK and add hipinfo to PATH, i.e. put rocm/bin on the PATH (refer to hipinfo#). This usually happens with the ROCm 5.7 package or release.

Hide a detected NVIDIA GPU

CUDA_VISIBLE_DEVICES=-1 (or a list such as 0,1,2,...); the device indices can be obtained by running nvidia-smi -L.

Hide a detected AMD GPU

HIP_VISIBLE_DEVICES=-1 (or a list such as 0,1,2,...); the device indices can be obtained by running hipinfo.

Multiple AMD GPU override: if you have multiple GPUs, you can add an index tag 0, 1, ... behind the override variable, as in the example below.

HSA_OVERRIDE_GFX_VERSION_0=10.3.0 HSA_OVERRIDE_GFX_VERSION_1=11.0.0
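For instance, a hedged PowerShell sketch of hiding an NVIDIA card and applying per-GPU overrides for the current session before starting the server (the override values are the examples above; substitute the ones matching your cards):

```powershell
# Hide NVIDIA GPUs from Ollama for this session.
$env:CUDA_VISIBLE_DEVICES = "-1"

# Per-device overrides: the index suffix _0, _1, ... follows the order reported by hipinfo.
$env:HSA_OVERRIDE_GFX_VERSION_0 = "10.3.0"
$env:HSA_OVERRIDE_GFX_VERSION_1 = "11.0.0"

# Run from your Ollama program directory.
.\ollama serve
```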

Troubleshooting AMD GPU support on Linux:

For users who prefer a self-build, or who use the official Ollama release on unsupported architectures, setting environment variables like HSA_OVERRIDE_GFX_VERSION=9.0.0 (or 9.0.6, 10.1.0, 10.3.0, 11.0.0) can enable GPU support on Linux systems.

Configuring environment variables on Linux via systemd:

1. Edit your systemd service:

sudo systemctl edit ollama.service

2. Add the following line under the [Service] section:

Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"

If you have multiple GPUs, add an index tag 0, 1, ... behind the override variable:

HSA_OVERRIDE_GFX_VERSION_0=10.3.0 HSA_OVERRIDE_GFX_VERSION_1=11.0.0

3. Reload the daemon and restart the service:

sudo systemctl daemon-reload
sudo systemctl restart ollama.service
However, some users may encounter issues such as:

"amdgpu too old gfx%d%x%x"
"unsupported Radeon iGPU detected skipping"
"unable to verify rocm library: %w"

These errors are due to incompatibility between the installed AMDGPU drivers and what ROCm supports. To potentially resolve them, consider modifying the discover/amd_linux.go file within your Ollama source directory if HSA_OVERRIDE_GFX_VERSION does not work for you.

Caution: editing this file can have unintended consequences. It's recommended to back up the original file before making any changes.

Specifically, focus on lines 295 to 320 and either delete or edit the relevant content. The exact modifications required will depend on your specific hardware and driver version. Refer to online forums and documentation for guidance on tailoring these edits for your situation.
Below is a **quick‑start checklist** that will get you from a fresh Windows machine to an Ollama build that
can run on an AMD GPU reporting the **gfx1036** architecture (typically the integrated graphics on recent
Ryzen desktop CPUs). Everything is written with the assumption you're on Windows 11 (the most common target
for the community builds).
> **TL;DR** – If you only want the final binary, download the *demo release* from the repo’s *releases* page
and skip the rest of this guide.
> If you want a custom build (smaller image, fewer supported GPUs, or you need to patch something), follow
the “Build from source” section below.
---
## 1. Verify your GPU arch
1. Open a PowerShell window.
2. Run
```powershell
hipinfo | findstr "gfx"
```
You should see your GPU's architecture in the output, e.g. a line such as:
```
gcnArchName:                      gfx1036
```
(The exact field name varies by SDK version; the important part is the `gfx1036` string.)
If `hipinfo` is not found, install the **HIP SDK 6.1.2** (recommended) or 6.2.4.
After installation, add `C:\Program Files\AMD\ROCm\6.1\bin` (or 6.2) to your system PATH.
---
## 2. Pick your build route
| Route | What you get | When to use it |
|-------|--------------|----------------|
| **Demo release** | One‑click installer with the pre‑built `rocblas.dll` for *gfx1036* | You just want it to
work – no compilation |
| **Self‑build (recommended for developers)** | Optimised binary for your GPU (smaller image, fewer debug
symbols) | You want a clean build or need to patch code |
| **Pre‑built ROCm libraries + official build** | You swap in the `rocblas.dll` and `rocblas\library` files for your arch at runtime |
Your GPU is on the community list but you want to avoid compiling the whole app |
For gfx1036, the community *demo release* is built with `gfx1036` in its GPU list, so it should ship a matching
`rocblas.dll`. If you prefer a self‑build, you'll need the matching `rocblas.dll` (and `library` folder) from
the **ROCmLibs for HIP SDK 6.1.2** or **6.2.4** packages.
---
## 3. Self‑build Ollama (from source)
> **NOTE** – The steps below assume you have `git`, CMake, Ninja, a recent C++ compiler (VS 2022 or clang is
fine), Go, and the HIP SDK installed; see the development.md guide referenced in step 1 of the wiki for the
authoritative list.
> The community build uses **CMake** for the GPU libraries.
### 3.1 Clone the repo
```powershell
git clone https://github.com/<fork-owner>/ollama-for-amd.git
cd ollama-for-amd
```
> Substitute `<fork-owner>` with the account hosting the ollama-for-amd fork you are using (or your own fork);
alternatively, clone the upstream `https://github.com/ollama/ollama.git` and apply the GPU-list edits yourself.
### 3.2 Edit the build presets
Open `CMakePresets.json` in a text editor and make sure the `AMDGPU_TARGETS` entry contains `gfx1036`. The
exact layout differs between Ollama versions, but the relevant part is a cache variable on the ROCm configure
preset; a trimmed sketch (assuming the current preset layout, with the target list abbreviated) looks roughly
like this:
```json
{
  "version": 3,
  "configurePresets": [
    {
      "name": "ROCm 6",
      "inherits": [ "ROCm" ],
      "cacheVariables": {
        "AMDGPU_TARGETS": "gfx900;gfx1030;gfx1100;gfx1101;gfx1102;gfx1036"
      }
    }
  ]
}
```
> **Why `gfx1036`?**
> Your GPU reports `gfx1036`, and that is the architecture name ROCm uses when compiling GPU kernels. Adding
it to `AMDGPU_TARGETS` makes the HIP/rocBLAS kernels get built for this architecture, and, together with the
`CMakeLists.txt` filter edit described in the wiki, lets the runtime GPU discovery accept it.
### 3.3 Install the ROCm runtime
You have two options:
| Option | How |
|--------|-----|
| **Pre‑built rocblas** | Grab the ROCmLibs package for your SDK version (6.1.2 or 6.2.4) and architecture from the ROCmLibs release pages linked in the wiki. |
| **Self‑compile rocblas** | Build rocBLAS yourself against the HIP SDK – only needed if no pre‑built package exists for your GPU. |
For gfx1036 a pre‑built package is normally available from the ROCmLibs releases; the exact archive name
varies by release, so pick the one whose name contains your SDK version and `gfx1036`.
Download it, unzip it somewhere (e.g. `C:\Temp\rocblas_gfx1036`), then copy both the DLL and the tuned-kernel
library folder:
```powershell
copy C:\Temp\rocblas_gfx1036\rocblas.dll "C:\Program Files\AMD\ROCm\6.1\bin"
xcopy /E /I /Y C:\Temp\rocblas_gfx1036\library "C:\Program Files\AMD\ROCm\6.1\bin\rocblas\library"
```
> **Tip** – `rocblas.dll` lives in the same folder as the HIP SDK tools; keeping it there lets the loader find
it automatically, and the `rocblas\library` folder must always match the DLL you install.
### 3.4 Build Ollama
```powershell
# 1. Configure with CMake/Ninja (clang is what the wiki's build step uses)
cmake -B build -G Ninja -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++
# 2. Build the GPU backend libraries
cmake --build build --config Release
# 3. On recent Ollama versions the executable itself is built with Go
#    (check development.md for the exact step for your checkout)
go build -o ollama.exe .
```
If everything goes well you'll have an `ollama.exe` plus GPU libraries under `build\lib\ollama` that can see
your GPU.
> **Common pitfall** – If you get a *"no compatible GPUs discovered"* message, double‑check that the ROCm
`bin` directory is on your PATH and that the `rocblas.dll` / `rocblas\library` files match `gfx1036`.
---
## 4. Install & test
### 4.1 Install the binary
1. **Demo** – Run the downloaded `OllamaSetup.exe` installer.
2. **Self‑build** – Copy `ollama.exe` (and the `lib\ollama` folder from your build) into
`C:\Users\<you>\AppData\Local\Programs\Ollama`.
> As long as the `rocblas.dll` and `rocblas\library` files under `lib\ollama\rocm` match `gfx1036`, no further
replacement is necessary.
### 4.2 Start the server
```powershell
cd "C:\Users\<you>\AppData\Local\Programs\Ollama"
.\ollama serve
```
Open **another** PowerShell window in the same folder (or anywhere, if the installer put `ollama` on your
PATH) and run a model:
```powershell
.\ollama run llama3.1
```
If the GPU was picked up, the server log will report a discovered GPU (a `source=gpu.go` line listing your
`gfx1036` device) before the model starts answering.
If you instead see:
```
level=INFO source=gpu.go:386 msg="no compatible GPUs were discovered"
```
the usual causes are a `rocblas.dll` / `rocblas\library` folder that doesn't match your architecture, or the
ROCm `bin` directory missing from PATH. Fix the library files first; on Windows builds the
`HSA_OVERRIDE_GFX_VERSION` trick generally no longer helps (see the wiki note above), so only reach for it on
self‑builds of the affected architectures.
(You can also set `CUDA_VISIBLE_DEVICES=-1` to hide NVIDIA GPUs, or use `HIP_VISIBLE_DEVICES` to choose which
AMD devices are visible.)
---
## 5. Common troubleshooting
| Symptom | Quick fix |
|---------|-----------|
| **Access violation 0xc0000005** | Update to the latest Radeon driver and make sure the replaced `rocblas`
files actually match your GPU; community issues also report that moving the `.dll`s into the runners folder
helps on older layouts. |
| **No GPU detected** | Verify the ROCm `bin` directory is on PATH, run `hipinfo` again, and confirm the
`rocblas.dll` / `rocblas\library` files match your architecture. |
| **Update prompts from Ollama** | Don’t click the “Update” button if you’re using a community build. Pull
the newest release manually from the repo’s releases page or use the “Ollama‑For‑AMD‑Installer”, which
replaces the libs automatically. |
| **Multiple GPUs** | Select devices with `HIP_VISIBLE_DEVICES=0,1` or set per‑device overrides such as
`HSA_OVERRIDE_GFX_VERSION_0=10.3.0 HSA_OVERRIDE_GFX_VERSION_1=11.0.0`. |
---
## 6. Final words
* The **demo release** is *one click* and ships `rocblas` files that cover `gfx1036`. If you’re a “just
want it to work” user, that’s the easiest path.
* The **self‑build** path gives you a clean binary that is noticeably smaller than the 2 GB+ all‑arch
package, because it only carries kernels for the GPUs you actually target.
* If you hit a wall, drop the error message (or `hipinfo` output) into a GitHub issue on the *ollama‑for‑amd*
repo – the community has already solved most of the headaches.
Happy Ollama-ing on your gfx1036 GPU!