sleepy / llama.cpp
.github/workflows at commit a4f011e8d02179f032627130f961eb77ee30401c

Latest commit a4f011e8d0 by Eve: vulkan: linux builds + small subgroup size fixes (#11767)
  * mm subgroup size
  * upload vulkan x86 builds
2025-02-14 02:59:40 +00:00
bench.yml.disabled             | ggml-backend : add device and backend reg interfaces (#9707)              | 2024-10-03 01:49:47 +02:00
build.yml                      | vulkan: linux builds + small subgroup size fixes (#11767)                 | 2025-02-14 02:59:40 +00:00
close-issue.yml                | ci : do not stale-close roadmap issues                                    | 2025-02-04 09:31:01 +02:00
docker.yml                     | ci : fix build CPU arm64 (#11472)                                         | 2025-01-29 00:02:56 +01:00
editorconfig.yml               | ci : pin dependency to specific version (#11137)                          | 2025-01-08 12:07:20 +01:00
gguf-publish.yml               | ci : update checkout, setup-python and upload-artifact to latest (#6456)  | 2024-04-03 21:01:13 +03:00
labeler.yml                    | labeler.yml: Use settings from ggerganov/llama.cpp [no ci] (#7363)        | 2024-05-19 20:51:03 +10:00
python-check-requirements.yml  | py : fix requirements check '==' -> '~=' (#8982)                          | 2024-08-12 11:02:01 +03:00
python-lint.yml                | ci : add ubuntu cuda build, build with one arch on windows (#10456)       | 2024-11-26 13:05:07 +01:00
python-type-check.yml          | ci : reduce severity of unused Pyright ignore comments (#9697)            | 2024-09-30 14:13:16 -04:00
server.yml                     | server : (webui) migrate project to ReactJS with typescript (#11688)      | 2025-02-06 17:32:29 +01:00