llama.cpp/examples/quantize-stats
Latest commit 527b6fba1d by Didzis Gosko:
llama : make model stateless and context stateful (llama_state) (#1797)
* llama : make model stateless and context stateful

* llama : minor cleanup

* llama : update internal API declaration

* Apply suggestions from code review

fix style

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* Missing model memory release

* Fix style

* Add deprecated warning for public API function llama_init_from_file

* Update public API use cases: move away from deprecated llama_init_from_file

* Deprecate public API function llama_apply_lora_from_file

---------

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
Committed 2023-06-24 11:47:58 +03:00
CMakeLists.txt       Add quantize-stats command for testing quantization (#728)                2023-04-08 00:09:18 +02:00
quantize-stats.cpp   llama : make model stateless and context stateful (llama_state) (#1797)   2023-06-24 11:47:58 +03:00
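For context, the commit referenced above splits model loading from context creation: the model becomes a stateless object that can be shared, all mutable evaluation state moves into the llama_context, and the old single-call entry points llama_init_from_file and llama_apply_lora_from_file are deprecated. The sketch below shows how a caller such as quantize-stats.cpp would adopt that pattern. It is a minimal illustration, not code from this directory; it assumes the post-#1797 names llama_load_model_from_file, llama_new_context_with_model, llama_free_model and llama_free, so check the llama.h of your checkout for the exact signatures.

```cpp
// Minimal sketch of the model/context split introduced by #1797.
// Assumption: the public API exposes llama_load_model_from_file,
// llama_new_context_with_model, llama_free and llama_free_model as in
// llama.h around this commit; verify against your version of the header.
#include "llama.h"

#include <cstdio>

int main(int argc, char ** argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <path to model>\n", argv[0]);
        return 1;
    }

    llama_context_params params = llama_context_default_params();

    // 1. Load the (stateless) model weights once.
    llama_model * model = llama_load_model_from_file(argv[1], params);
    if (model == nullptr) {
        fprintf(stderr, "failed to load model: %s\n", argv[1]);
        return 1;
    }

    // 2. Create a (stateful) context that references the shared model.
    //    Several contexts could be created from the same model.
    llama_context * ctx = llama_new_context_with_model(model, params);
    if (ctx == nullptr) {
        fprintf(stderr, "failed to create context\n");
        llama_free_model(model);
        return 1;
    }

    // ... evaluation / quantization statistics would run here ...

    // 3. Release the context first, then the model that owns the weights.
    llama_free(ctx);
    llama_free_model(model);
    return 0;
}
```

Before this change, a single llama_init_from_file call both loaded the weights and created the context, so every context duplicated the model load; with the split, quantization experiments can reuse one loaded model across runs.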