sleepy/llama.cpp
c2101a2e909ac7c08976d414e64e96c90ee5fa9e
llama.cpp/examples/server/tests/features/steps
Latest commit: 76e868821a by Pierrick Hymbert (2024-03-08 12:25:04 +01:00)
server: metrics: add llamacpp:prompt_seconds_total and llamacpp:tokens_predicted_seconds_total, reset bucket only on /metrics. Fix values cast to int. Add Process-Start-Time-Unix header. (#5937)
Closes #5850
steps.py