llama.cpp / .devops
Commit: 2bed4aa3f37cb4e39e16e9ec7b595a7738fd5faf
Latest commit: devops : add intel oneapi dockerfile (#5068)
Author: Xuan Son Nguyen
Co-authored-by: Xuan Son Nguyen <xuanson.nguyen@snowpack.eu>
Date: 2024-01-23 09:11:39 +02:00
File                        | Last commit                                                           | Date
----------------------------|-----------------------------------------------------------------------|---------------------------
nix                         | nix: add a comment on the many nixpkgs-with-cuda instances            | 2024-01-22 12:19:30 +00:00
cloud-v-pipeline            | ci : Cloud-V for RISC-V builds (#3160)                                | 2023-09-15 11:06:56 +03:00
full-cuda.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)        | 2023-12-29 16:50:29 +02:00
full-rocm.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)        | 2023-12-29 16:50:29 +02:00
full.Dockerfile             | python : add check-requirements.sh and GitHub workflow (#4585)        | 2023-12-29 16:50:29 +02:00
llama-cpp-clblast.srpm.spec | devops : added systemd units and set versioning to use date. (#2835)  | 2023-08-28 09:31:24 +03:00
llama-cpp-cublas.srpm.spec  | devops : added systemd units and set versioning to use date. (#2835)  | 2023-08-28 09:31:24 +03:00
llama-cpp.srpm.spec         | devops : added systemd units and set versioning to use date. (#2835)  | 2023-08-28 09:31:24 +03:00
main-cuda.Dockerfile        | docker : add git to full-cuda.Dockerfile main-cuda.Dockerfile (#3044) | 2023-09-08 13:57:55 +03:00
main-intel.Dockerfile       | devops : add intel oneapi dockerfile (#5068)                          | 2024-01-23 09:11:39 +02:00
main-rocm.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)        | 2023-12-29 16:50:29 +02:00
main.Dockerfile             | Add llama.cpp docker support for non-latin languages (#1673)          | 2023-06-08 00:58:53 -07:00
tools.sh                    | docker : add finetune option (#4211)                                  | 2023-11-30 23:46:01 +02:00
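As an illustrative sketch only: Dockerfiles in a directory like this are typically built with `docker build` from the repository root, passing the Dockerfile path via `-f`. The image tags below are hypothetical example names, not tags defined by this listing.

```shell
# Hedged sketch, run from the llama.cpp repository root.
# "local/llama.cpp:full-cuda" is an arbitrary example tag.
docker build -t local/llama.cpp:full-cuda -f .devops/full-cuda.Dockerfile .

# Likewise for the Intel oneAPI image added in #5068:
docker build -t local/llama.cpp:main-intel -f .devops/main-intel.Dockerfile .
```

The trailing `.` sets the build context to the repository root, which these Dockerfiles presumably need in order to COPY the source tree into the build.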