sleepy/llama.cpp
llama.cpp/tools/server/webui/src @ 51a48720b8e46f4173a9535dc895e45e800f18b3
Latest commit: 51a48720b8 by Xuan-Son Nguyen, 2025-12-29 21:42:11 +01:00
webui: fix prompt progress ETA calculation (#18468)
* webui: fix prompt progress ETA calculation
* handle case done === 0
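The commit note mentions special-casing `done === 0`. A minimal sketch of why that guard matters in an ETA calculation, assuming hypothetical names (`promptProgressEta`, `done`, `total`, `elapsedMs`) rather than the actual llama.cpp webui code:

```typescript
// Hypothetical ETA estimate for prompt processing; all names here are
// illustrative assumptions, not the real llama.cpp webui implementation.
function promptProgressEta(done: number, total: number, elapsedMs: number): number | null {
  // With zero tokens processed there is no rate to extrapolate from,
  // so return null instead of dividing by zero (the `done === 0` case).
  if (done === 0 || total <= 0) return null;
  const msPerToken = elapsedMs / done;   // observed processing rate
  const remaining = total - done;        // tokens still to process
  return Math.max(0, remaining * msPerToken); // estimated milliseconds left
}
```

Returning `null` (rather than `Infinity` or `NaN`) lets the UI show "estimating…" until at least one token has been processed.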
Name       Last commit                                                                        Date
lib        webui: fix prompt progress ETA calculation (#18468)                                2025-12-29 21:42:11 +01:00
routes     webui: apply webui_settings on first load (#18223)                                 2025-12-23 15:48:03 +01:00
styles     feat(webui): improve LaTeX rendering with currency detection (#16508)              2025-11-03 00:41:08 +01:00
app.css    server: introduce API for serving / loading / unloading multiple models (#17470)   2025-12-01 19:41:04 +01:00
app.d.ts   webui: Fix selecting generated output issues during active streaming (#18091)      2025-12-18 11:13:52 +01:00
app.html   SvelteKit-based WebUI (#14839)                                                     2025-09-17 19:29:13 +02:00