# perplexity

TODO
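Until this README is filled in: the tool's name refers to the standard language-model perplexity metric, the exponential of the mean negative log-likelihood per token. A minimal sketch of that formula (the function name and sample values here are illustrative, not part of llama.cpp's API):

```python
import math

def perplexity(neg_log_likelihoods):
    # Perplexity = exp of the average negative log-likelihood per token.
    # Lower is better; a uniform distribution over N outcomes gives N.
    return math.exp(sum(neg_log_likelihoods) / len(neg_log_likelihoods))

# Example: every token assigned probability 1/2, i.e. NLL = ln(2) each.
print(perplexity([math.log(2)] * 4))  # -> 2.0
```

A model that assigns each token probability 1/2 therefore scores a perplexity of exactly 2, matching the intuition of "how many equally likely choices the model is effectively guessing among".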