### Name and Version

```shell
$ bin/llama-cli.exe --version
load_backend: failed to find ggml_backend_init in C:\Users\devcloud\Documents\repo\llama.cpp\build-ci-debug\bin\ggml-vulkan.dll
load_backend: failed to find ggml_backend_init in C:\Users\devcloud\Documents\repo\llama.cpp\build-ci-debug\bin\ggml-cpu.dll
version: 8829 (6990e2f)
built with GNU 15.2.0 for Windows AMD64
```
### Operating systems

Windows
### Which llama.cpp modules do you know to be affected?

Core DLLs?
### Command line

```shell
bin/test-chat-template.exe
```
### Problem description & steps to reproduce

I'm observing a complete hang in `test-chat-template.exe` from 6990e2f when building for CI with

```shell
LLAMA_FATAL_WARNINGS=OFF GG_BUILD_NINJA=1 GG_BUILD_VULKAN=1 GG_BUILD_LOW_PERF=1 ./ci/run.sh ./results/llama.cpp ./mnt/llama.cpp
```

on MSYS2 UCRT64 + GCC. The test itself appears to run to completion, but the process then hangs at exit and cannot be interrupted with Ctrl-C. Running under gdb behaves the same (Ctrl-C has no effect). fcc7508 works correctly.

I see the same issue when building with `cmake -B build_vk -DGGML_VULKAN=ON -DLLAMA_CURL=OFF -DBUILD_SHARED_LIBS=ON`, but it does not occur when building with `cmake -B build_vk -DGGML_VULKAN=ON -DLLAMA_CURL=OFF`, so the hang appears tied to the shared-library build.
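For quick side-by-side reproduction outside the CI script, the two configurations above can be sketched as follows (assuming a checkout of llama.cpp at 6990e2f; `build_static` is just an illustrative directory name, and `test-chat-template` is assumed to be the CMake target for this test):

```shell
# Shared-library build: test output completes, then the process hangs at exit
cmake -B build_vk -DGGML_VULKAN=ON -DLLAMA_CURL=OFF -DBUILD_SHARED_LIBS=ON
cmake --build build_vk --target test-chat-template
./build_vk/bin/test-chat-template.exe

# Static build of the same tree: exits normally after the tests pass
cmake -B build_static -DGGML_VULKAN=ON -DLLAMA_CURL=OFF
cmake --build build_static --target test-chat-template
./build_static/bin/test-chat-template.exe
```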
### First Bad Commit

6990e2f (#21936)
### Relevant log output
# build-ci-debug/bin/test-chat-template.exe
Built-in chat templates:
bailing
bailing-think
bailing2
chatglm3
chatglm4
chatml
command-r
deepseek
deepseek-ocr
deepseek2
deepseek3
exaone-moe
exaone3
exaone4
falcon3
gemma
gigachat
glmedge
gpt-oss
granite
granite-4.0
grok-2
hunyuan-dense
hunyuan-moe
hunyuan-ocr
kimi-k2
llama2
llama2-sys
llama2-sys-bos
llama2-sys-strip
llama3
llama4
megrez
minicpm
mistral-v1
mistral-v3
mistral-v3-tekken
mistral-v7
mistral-v7-tekken
monarch
openchat
orion
pangu-embedded
phi3
phi4
rwkv-world
seed_oss
smolvlm
solar-open
vicuna
vicuna-orca
yandex
zephyr
=== teknium/OpenHermes-2.5-Mistral-7B ===
=== mistralai/Mistral-7B-Instruct-v0.2 (NOTE: Old pre-v1 without a system prompt) ===
=== TheBloke/FusionNet_34Bx2_MoE-AWQ ===
=== bofenghuang/vigogne-2-70b-chat ===
=== mlabonne/AlphaMonarch-7B ===
=== google/gemma-7b-it ===
=== OrionStarAI/Orion-14B-Chat ===
=== openchat/openchat-3.5-0106 ===
=== deepseek-ai/deepseek-coder-33b-instruct ===
=== eachadea/vicuna-13b-1.1 ===
=== Orca-Vicuna ===
=== CohereForAI/c4ai-command-r-plus ===
=== Llama-3 ===
=== Phi-3-mini ===
=== Phi-3-small ===
=== Phi-3-medium ===
=== Phi-3-vision ===
=== ChatGLM3 ===
=== ChatGLM4 ===
=== GLMEdge ===
=== MiniCPM-3B-OpenHermes-2.5-v2-GGUF ===
=== DeepSeek-V2 ===
=== ibm-granite/granite-3.0-8b-instruct ===
=== mistralai/Mistral-7B-Instruct-v0.2 (mistralai 'v1' template with a system prompt) ===
=== Mistral-Large-Instruct-2407 (mistralai 'v3' template; modified to have system prompt at start) ===
=== Mistral-Nemo-Instruct-2407 (mistralai 'v3-tekken' template; modified to have system prompt at start) ===
=== mistralai/Mistral-Large-Instruct-2411 (mistralai 'v7' template) ===
=== ai-sage/GigaChat-20B-A3B-instruct ===
=== Infinigence/Megrez-3B-Instruct ===
=== phi-4 ===
=== yandex/YandexGPT-5-Lite-8B-instruct ===
=== inclusionAI/Ling-lite ===
=== ByteDance-Seed/Seed-OSS-36B-Instruct ===
=== ibm-granite/granite-3.x (tool call) ===
=== ibm-granite/granite-4.0 (tool call) ===
=== teknium/OpenHermes-2.5-Mistral-7B (jinja) ===
=== mistralai/Mistral-7B-Instruct-v0.2 (NOTE: Old pre-v1 without a system prompt) (jinja) ===
=== TheBloke/FusionNet_34Bx2_MoE-AWQ (jinja) ===
=== bofenghuang/vigogne-2-70b-chat (jinja) ===
=== mlabonne/AlphaMonarch-7B (jinja) ===
=== google/gemma-7b-it (jinja) ===
=== OrionStarAI/Orion-14B-Chat (jinja) ===
=== openchat/openchat-3.5-0106 (jinja) ===
=== deepseek-ai/deepseek-coder-33b-instruct (jinja) ===
=== eachadea/vicuna-13b-1.1 (jinja) ===
=== Orca-Vicuna (jinja) ===
=== CohereForAI/c4ai-command-r-plus (jinja) ===
=== Llama-3 (jinja) ===
=== Phi-3-mini (jinja) ===
=== Phi-3-small (jinja) ===
=== Phi-3-medium (jinja) ===
=== Phi-3-vision (jinja) ===
=== ChatGLM3 (jinja) ===
=== ChatGLM4 (jinja) ===
=== GLMEdge (jinja) ===
=== MiniCPM-3B-OpenHermes-2.5-v2-GGUF (jinja) ===
=== DeepSeek-V2 (jinja) ===
=== ibm-granite/granite-3.0-8b-instruct (jinja) ===
=== mistralai/Mistral-7B-Instruct-v0.2 (mistralai 'v1' template with a system prompt) (jinja) ===
=== Mistral-Large-Instruct-2407 (mistralai 'v3' template; modified to have system prompt at start) (jinja) ===
=== Mistral-Nemo-Instruct-2407 (mistralai 'v3-tekken' template; modified to have system prompt at start) (jinja) ===
=== mistralai/Mistral-Large-Instruct-2411 (mistralai 'v7' template) (jinja) ===
=== Infinigence/Megrez-3B-Instruct (jinja) ===
=== phi-4 (jinja) ===
=== yandex/YandexGPT-5-Lite-8B-instruct (jinja) ===
render_message_to_json: Neither string content nor typed content is supported by the template. This is unexpected and may lead to issues.
=== inclusionAI/Ling-lite (jinja) ===
=== ByteDance-Seed/Seed-OSS-36B-Instruct (jinja) ===
=== ibm-granite/granite-3.x (tool call) (jinja) ===
=== ibm-granite/granite-4.0 (tool call) (jinja) ===
OK: All tests passed successfully.
<-- HANGS HERE