Commit 45172df

ci : disable AMX jobs (ggml-org#20654)

[no ci]

1 parent 9b342d0
1 file changed: 13 additions & 10 deletions

.github/workflows/build-self-hosted.yml
@@ -97,19 +97,21 @@ jobs:
           vulkaninfo --summary
           GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
-  ggml-ci-cpu-amx:
-    runs-on: [self-hosted, Linux, CPU, AMX]
+  # TODO: provision AMX-compatible machine
+  #ggml-ci-cpu-amx:
+  #  runs-on: [self-hosted, Linux, CPU, AMX]
 
-    steps:
-      - name: Clone
-        id: checkout
-        uses: actions/checkout@v6
+  #  steps:
+  #    - name: Clone
+  #      id: checkout
+  #      uses: actions/checkout@v6
 
-      - name: Test
-        id: ggml-ci
-        run: |
-          bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
+  #  - name: Test
+  #    id: ggml-ci
+  #    run: |
+  #      bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
+  # TODO: provision AMD GPU machine
   # ggml-ci-amd-vulkan:
   #   runs-on: [self-hosted, Linux, AMD]
 
@@ -124,6 +126,7 @@ jobs:
   #     vulkaninfo --summary
   #     GG_BUILD_VULKAN=1 bash ./ci/run.sh ~/results/llama.cpp /mnt/llama.cpp
 
+  # TODO: provision AMD GPU machine
   # ggml-ci-amd-rocm:
   #   runs-on: [self-hosted, Linux, AMD]
 