sleepy/llama.cpp
llama.cpp/ggml/include at commit e991e3127ff71a29e61fe1de5dd1cbd2e1df1858
Latest commit: Diego Devesa e991e3127f — llama : use smart pointers for ggml resources (#10117) — 2024-11-01 23:48:26 +01:00
ggml-alloc.h    ggml : fix typo in example usage ggml_gallocr_new (ggml/984)              2024-10-04 18:50:05 +03:00
ggml-amx.h      add amx kernel for gemm (#8998)                                           2024-10-18 13:34:36 +08:00
ggml-backend.h  llama : refactor model loader with backend registry (#10026)              2024-10-30 02:01:23 +01:00
ggml-blas.h     ggml : add backend registry / device interfaces to BLAS backend (#9752)   2024-10-07 21:55:08 +02:00
ggml-cann.h     [CANN] Adapt to dynamically loadable backends mechanism (#9970)           2024-10-22 16:16:01 +08:00
ggml-cpp.h      llama : use smart pointers for ggml resources (#10117)                    2024-11-01 23:48:26 +01:00
ggml-cuda.h     llama : refactor model loader with backend registry (#10026)              2024-10-30 02:01:23 +01:00
ggml-kompute.h  kompute: add backend registry / device interfaces (#10045)                2024-10-30 17:01:52 +01:00
ggml-metal.h    ggml : add metal backend registry / device (#9713)                        2024-10-07 18:27:51 +03:00
ggml-rpc.h      rpc : add backend registry / device interfaces (#9812)                    2024-10-10 20:14:55 +02:00
ggml-sycl.h     [SYCL] Add SYCL Backend registry, device and Event Interfaces (#9705)     2024-10-18 06:46:16 +01:00
ggml-vulkan.h   vulkan : add backend registry / device interfaces (#9721)                 2024-10-17 02:46:58 +02:00
ggml.h          ggml : remove ggml_scratch (#10121)                                       2024-11-01 12:58:45 +02:00