sleepy/llama.cpp
Path: llama.cpp/tools/server/webui/tests/stories
Commit: 33a56f90a6a793a3c7b1f6ca39ff43a1cecd0b61
Latest commit: 4c61875bf8 by Aleksander Grygier, "webui: Add switcher to Chat Message UI to show raw LLM output (#19571)", 2026-02-12 19:55:51 +01:00
Name                            Last commit                                                                        Date
fixtures                        server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
ChatForm.stories.svelte         Webui/file upload (#18694)                                                         2026-01-09 16:45:32 +01:00
ChatMessage.stories.svelte      webui: Add switcher to Chat Message UI to show raw LLM output (#19571)             2026-02-12 19:55:51 +01:00
ChatSettings.stories.svelte     server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
ChatSidebar.stories.svelte      server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
Introduction.mdx                server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
MarkdownContent.stories.svelte  server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00