llama.cpp/src
commit bb4f7a9e4e
Author: compilade
Date: 2025-07-08 18:37:47 +03:00

    memory : fix broken batch splits for recurrent cache (#14575)

    Splits producing more than one ubatch per batch for recurrent models
    were broken by #14512.

    This fixes it by moving the completeness check after the ubatch split loop.