llama.cpp / tools / server / webui / docs / flows
at commit 4c61875bf86463326157130d8c1b6c1d642e131b

Latest commit: Aleksander Grygier, webui: Add switcher to Chat Message UI to show raw LLM output (#19571), 2026-02-12 19:55:51 +01:00
chat-flow.md
    server: introduce API for serving / loading / unloading multiple models (#17470)
    2025-12-01 19:41:04 +01:00

conversations-flow.md
    server: introduce API for serving / loading / unloading multiple models (#17470)
    2025-12-01 19:41:04 +01:00

data-flow-simplified-model-mode.md
    server: introduce API for serving / loading / unloading multiple models (#17470)
    2025-12-01 19:41:04 +01:00

data-flow-simplified-router-mode.md
    Use OpenAI-compatible /v1/models endpoint by default (#17689)
    2025-12-03 20:49:09 +01:00

database-flow.md
    server: introduce API for serving / loading / unloading multiple models (#17470)
    2025-12-01 19:41:04 +01:00

models-flow.md
    Use OpenAI-compatible /v1/models endpoint by default (#17689)
    2025-12-03 20:49:09 +01:00

server-flow.md
    server: introduce API for serving / loading / unloading multiple models (#17470)
    2025-12-01 19:41:04 +01:00

settings-flow.md
    webui: Add switcher to Chat Message UI to show raw LLM output (#19571)
    2026-02-12 19:55:51 +01:00