sleepy/llama.cpp
llama.cpp/.devops at commit b7f5f46e03edbe73abb0784e27faa20efb8a42d5

Latest commit: b7f5f46e03 by Sigbjørn Skjæret, docker : include legacy llama-completion binary (#17964), 2025-12-12 19:39:23 +01:00
Name                       Last commit                                                Date
nix/                       Install rpc-server when GGML_RPC is ON. (#17149)           2025-11-11 10:53:59 +00:00
cann.Dockerfile            docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
cpu.Dockerfile             docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
cuda.Dockerfile            docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
intel.Dockerfile           docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
llama-cli-cann.Dockerfile  docker : do not build tests (#13204)                       2025-04-30 10:44:07 +02:00
llama-cpp-cuda.srpm.spec   repo : update links to new url (#11886)                    2025-02-15 16:40:57 +02:00
llama-cpp.srpm.spec        repo : update links to new url (#11886)                    2025-02-15 16:40:57 +02:00
musa.Dockerfile            docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
rocm.Dockerfile            docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
s390x.Dockerfile           docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
tools.sh                   docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
vulkan.Dockerfile          docker : include legacy llama-completion binary (#17964)   2025-12-12 19:39:23 +01:00
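
Each backend-specific Dockerfile above is built from the repository root with docker build -f. A minimal sketch of building and running the CUDA variant follows; the image tag is arbitrary, the model path is a placeholder, and the "server" stage name follows upstream llama.cpp convention, so check the Dockerfile you target for the stages it actually defines.

    # Build the server stage of the CUDA image from the repository root
    # (stage name assumed from upstream llama.cpp convention; verify in cuda.Dockerfile).
    docker build -t local/llama.cpp:server-cuda --target server -f .devops/cuda.Dockerfile .

    # Run it with GPU access; /path/to/models and model.gguf are placeholders.
    docker run --gpus all -v /path/to/models:/models -p 8080:8080 \
        local/llama.cpp:server-cuda -m /models/model.gguf --host 0.0.0.0 --port 8080

The other Dockerfiles (rocm, vulkan, intel, musa, cann, s390x, cpu) follow the same pattern with their respective runtimes, while the .srpm.spec files cover RPM packaging rather than containers.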