sleepy/llama.cpp
llama.cpp/.devops at commit 40643edb86eb10b471b0f57d4f3f7eb0e06a0df7
Last commit: Svetlozar Georgiev 40643edb86 — sycl: fix docker image (#14144), 2025-06-13 18:32:56 +02:00
| Name | Last commit | Date |
| --- | --- | --- |
| nix | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| cloud-v-pipeline | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| cpu.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| cuda.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| intel.Dockerfile | sycl: fix docker image (#14144) | 2025-06-13 18:32:56 +02:00 |
| llama-cli-cann.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| llama-cpp-cuda.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| llama-cpp.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| musa.Dockerfile | musa: Upgrade MUSA SDK version to rc4.0.1 and use mudnn::Unary::IDENTITY op to accelerate D2D memory copy (#13647) | 2025-05-21 09:58:49 +08:00 |
| rocm.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| tools.sh | docker: add perplexity and bench commands to full image (#11438) | 2025-01-28 10:42:32 +00:00 |
| vulkan.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |