llama.cpp/examples/server/public
Jhen-Jie Hong 29674ab4e8
server : display token probabilities in the UI (#2489)
* server : add n_probs param in chat UI

* server : keep message data array & show in probabilities component

* server : add simple popover component

* server : fix completion_probabilities being undefined when n_probs is not set

* server : implement Probabilities

* server : handle bytes

* server : cap n_probs at 10 for easy scrolling

* server : adjust for dark/light mode

* server : fix regenerated prompt

* server : update index.html.hpp

* server : convert prob to percentage + show original value as div title

* server : fix Probabilities not being used when an empty string is included

* server : skip byte pairs when displaying probabilities

* server : remove array check of completion_probabilities in messages

* skip empty arrays and byte pairs (> 1) in Probabilities

* generate index.html.hpp

* fix incorrect prob conversion when the string is already a known token

* use the final response to show probabilities on stop

* revert unnecessary change

* correct probabilities usage

* remove unused function

* always send a partial response to get correct probs for the last to_send

* fix typo

* fix content of format_final_response

* refactor probs render & make pColor transparent if not found

* send an empty string when stop_pos is found in a partial response

* avoid unnecessary empty data events & send the rest of the partial tokens on stop

* use <br /> for new line

* skip the -1 token in the loop to avoid sending '' at the end

* trim trailing new lines on stop

* revert unnecessary change
2023-08-25 18:32:45 +08:00
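Several of the commits above describe the same piece of display logic: convert each token probability to a percentage for the UI (keeping the raw value for a tooltip), and skip empty probability arrays and raw byte tokens that cannot be rendered. A minimal sketch of that logic follows; the names `probsToDisplay` and `isByteToken`, and the `byte:` prefix check, are illustrative assumptions, not the exact identifiers used in index.html:

```javascript
// Hypothetical check: a token rendered as a raw byte placeholder
// (e.g. "byte: \\x8e") cannot be displayed directly, so skip it.
const isByteToken = (s) => /^byte: /.test(s);

// Sketch of preparing completion_probabilities entries for display.
function probsToDisplay(completionProbabilities) {
  const out = [];
  for (const entry of completionProbabilities ?? []) {
    // skip empty arrays and byte-pair tokens, as the commits describe
    if (!entry.probs || entry.probs.length === 0) continue;
    if (isByteToken(entry.content)) continue;
    out.push({
      content: entry.content,
      probs: entry.probs.map((p) => ({
        tok_str: p.tok_str,
        prob: p.prob, // keep the original value (shown as a div title)
        percent: (p.prob * 100).toFixed(2) + '%', // converted for the UI
      })),
    });
  }
  return out;
}
```

Skipping byte tokens rather than rendering them keeps the popover readable; the original probability survives in `prob` so the UI can still show it on hover.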
File                          Last commit                                                                           Date
completion.js                 Fixing race condition in server and partial stream handling in frontend. (#2391)      2023-08-04 13:37:24 +02:00
index.html                    server : display token probabilities in the UI (#2489)                                2023-08-25 18:32:45 +08:00
index.js                      server : implement json-schema-to-grammar.mjs & add grammar param in the UI (#2588)   2023-08-14 15:16:54 +08:00
json-schema-to-grammar.mjs    server : implement json-schema-to-grammar.mjs & add grammar param in the UI (#2588)   2023-08-14 15:16:54 +08:00