sleepy / llama.cpp
Commit: 9c55e5c5c24990dd17fd6c8f4f2159052d2b06f1
Path: llama.cpp / .devops
Latest commit: da84c04d8f by Xuan-Son Nguyen, 2025-04-30 10:44:07 +02:00
docker : do not build tests (#13204)
* docker : do not build tests
* include "ggml-cpu.h"
| Name | Last commit | Date |
|---|---|---|
| .. | | |
| nix | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| cloud-v-pipeline | build : rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| cpu.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| cuda.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| intel.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| llama-cli-cann.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| llama-cpp-cuda.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| llama-cpp.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| musa.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| rocm.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| tools.sh | docker: add perplexity and bench commands to full image (#11438) | 2025-01-28 10:42:32 +00:00 |
| vulkan.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
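The directory holds one Dockerfile per compute backend (cpu, cuda, intel, musa, rocm, vulkan, plus CANN). Each is built from the repository root with `docker build -f`. A minimal sketch of how such a build could be invoked; the `build_cmd` helper and the `local/llama.cpp` image tag are illustrative assumptions, not part of the repo, and the available build targets vary per Dockerfile:

```shell
# Hypothetical helper: print the docker build command for one of the
# backend Dockerfiles in .devops/ (tag name is an assumption).
build_cmd() {
  backend="$1"   # e.g. cpu, cuda, intel, musa, rocm, vulkan
  echo "docker build -t local/llama.cpp:${backend} -f .devops/${backend}.Dockerfile ."
}

# Show the command for the CPU-only image.
build_cmd cpu
```

Running the printed command from the repository root would build the corresponding image; GPU variants additionally need the matching runtime (e.g. the NVIDIA Container Toolkit for cuda.Dockerfile).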