llama.cpp/tools/server/webui/tests/stories
at commit fe00a84b4b972b3172b2c6b880954d81da532ca4
Latest commit: afa6bfe4f7 by Aleksander Grygier, 2026-02-17 13:47:45 +01:00
Pre-MCP UI and architecture cleanup (#19685)

* webui: extract non-MCP changes from mcp-mvp review split
* webui: extract additional pre-MCP UI and architecture cleanup
* chore: update webui build output
fixtures                          server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
ChatMessage.stories.svelte        webui: Add switcher to Chat Message UI to show raw LLM output (#19571)              2026-02-12 19:55:51 +01:00
ChatScreenForm.stories.svelte     Pre-MCP UI and architecture cleanup (#19685)                                        2026-02-17 13:47:45 +01:00
ChatSettings.stories.svelte       server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
ChatSidebar.stories.svelte        server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
Introduction.mdx                  server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
MarkdownContent.stories.svelte    webui: Architecture and UI improvements (#19596)                                     2026-02-14 09:06:41 +01:00
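
The `.stories.svelte` files listed above are Storybook stories for the webui's Svelte components. As a rough illustration only, a minimal story could look like the sketch below; it assumes the Svelte CSF addon's `defineMeta` API, and the import path, story title, and args are hypothetical, not taken from the actual webui code.

```svelte
<script module>
  // Hypothetical sketch: the import path and prop values are assumptions.
  import { defineMeta } from '@storybook/addon-svelte-csf';
  import ChatMessage from '$lib/components/ChatMessage.svelte';

  // Register the component with Storybook and obtain the <Story> wrapper.
  const { Story } = defineMeta({
    title: 'Chat/ChatMessage',
    component: ChatMessage,
  });
</script>

<!-- One named story with example args (assumed prop names). -->
<Story name="Assistant message" args={{ role: 'assistant', content: 'Hello from the model!' }} />
```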