sleepy/llama.cpp
llama.cpp/tools/server/webui/tests at commit 423cf0b26fc0b72ff4bb4656a68a607d38b95fe5
Latest commit 4c61875bf8 (Aleksander Grygier, 2026-02-12 19:55:51 +01:00): webui: Add switcher to Chat Message UI to show raw LLM output (#19571)
client     server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
e2e        server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
stories    webui: Add switcher to Chat Message UI to show raw LLM output (#19571)             2026-02-12 19:55:51 +01:00
unit       webui: Improve copy to clipboard with text attachments (#17969)                    2025-12-16 07:38:46 +01:00