sleepy/llama.cpp
llama.cpp/requirements at commit fcf6538ba6702c55eaec70da9a75c81d04900a72
Latest commit: fabf30b4c4 by Georgi Gerganov, 2024-05-21 02:35:28 +10:00

llama : remove Persimmon (#7408)

* llama : remove Persimmon
* requirements : remove
File                                           Last commit                                                     Date
requirements-convert-hf-to-gguf-update.txt     convert-hf : save memory with lazy evaluation (#7075)           2024-05-08 18:16:38 -04:00
requirements-convert-hf-to-gguf.txt            convert-hf : save memory with lazy evaluation (#7075)           2024-05-08 18:16:38 -04:00
requirements-convert-llama-ggml-to-gguf.txt    python : add check-requirements.sh and GitHub workflow (#4585)  2023-12-29 16:50:29 +02:00
requirements-convert.txt                       convert-hf : save memory with lazy evaluation (#7075)           2024-05-08 18:16:38 -04:00