llama.cpp/prompts
Latest commit: b608b55a3e by CRD716 — prompts : model agnostic DAN (#1304) — 2023-05-11 18:10:19 +03:00

Commit message:
* add model-agnostic dan prompt
* quick readme update
* save a token
* Revert "quick readme update" (reverts commit 8dc342c069.)
alpaca.txt              — Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) — 2023-04-14 22:58:43 +03:00
chat-with-bob.txt       — Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) — 2023-04-14 22:58:43 +03:00
chat-with-vicuna-v0.txt — examples : read chat prompts from a template file (#1196) — 2023-05-03 20:58:11 +03:00
chat-with-vicuna-v1.txt — examples : read chat prompts from a template file (#1196) — 2023-05-03 20:58:11 +03:00
chat.txt                — examples : read chat prompts from a template file (#1196) — 2023-05-03 20:58:11 +03:00
dan-modified.txt        — prompts : model agnostic DAN (#1304) — 2023-05-11 18:10:19 +03:00
dan.txt                 — prompts : model agnostic DAN (#1304) — 2023-05-11 18:10:19 +03:00
reason-act.txt          — do not force the prompt file to end with a new line (#908) — 2023-04-13 11:33:16 +02:00