llama.cpp/prompts
File                     Last commit                                                                      Date
alpaca.txt               Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982)   2023-04-14
chat-with-baichuan.txt   feature : support Baichuan serial models (#3009)                                 2023-09-14
chat-with-bob.txt        Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982)   2023-04-14
chat-with-vicuna-v0.txt  examples : read chat prompts from a template file (#1196)                        2023-05-03
chat-with-vicuna-v1.txt  examples : read chat prompts from a template file (#1196)                        2023-05-03
chat.txt                 examples : read chat prompts from a template file (#1196)                        2023-05-03
dan-modified.txt         prompts : model agnostic DAN (#1304)                                             2023-05-11
dan.txt                  prompts : model agnostic DAN (#1304)                                             2023-05-11
reason-act.txt           do not force the prompt file to end with a new line (#908)                       2023-04-13