sleepy/llama.cpp
014dca49d6c1c735d58f8bcf4e101f8cc80fbfc5
llama.cpp/.github
Latest commit: Ruben Ortlam 8dc530b86d ci: disable test-backend-ops on Vulkan llvmpipe run and restore default timeout (#21901), 2026-04-15 10:55:21 +02:00
actions                     ggml : add OpenVINO backend (#15307)                                                       2026-03-14 07:56:55 +02:00
ISSUE_TEMPLATE              issues: add openvino backends (#20932)                                                     2026-03-24 14:41:10 +08:00
workflows                   ci: disable test-backend-ops on Vulkan llvmpipe run and restore default timeout (#21901)   2026-04-15 10:55:21 +02:00
labeler.yml                 ci: drop v5 all: composition from labeler.yml (#21627)                                     2026-04-09 08:20:19 +02:00
pull_request_template.md    contrib: add "Requirements" section to PR template (#20841)                                2026-03-23 16:59:02 +01:00