llama.cpp/examples
Latest commit: cc9cee8e9e by Sergey Alirzaev, 2023-04-06 17:59:11 +02:00
Do not crash when it has nothing to say. (#796)
Otherwise, observing this in interactive mode:
/usr/lib/gcc/x86_64-pc-linux-gnu/12/include/g++-v12/bits/stl_vector.h:1230: reference std::vector<int>::back() [_Tp = int, _Alloc = std::allocator<int>]: Assertion '!this->empty()' failed.
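The assertion above fires because std::vector::back() is called on an empty vector of generated tokens; with _GLIBCXX_ASSERTIONS enabled, libstdc++ aborts instead of silently invoking undefined behavior. Below is a minimal sketch of the kind of guard that prevents it; the buffer name embd and the end-of-sequence token id are illustrative assumptions, not the actual code in examples/main:

    // Sketch only: guard back() with an emptiness check so an empty token
    // buffer never triggers the libstdc++ assertion quoted above.
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> embd;  // hypothetical stand-in for the tokens generated so far

        // embd.back() on an empty vector is undefined behavior and aborts with
        // "Assertion '!this->empty()' failed" when _GLIBCXX_ASSERTIONS is set.
        if (!embd.empty() && embd.back() == 2 /* illustrative end-of-sequence id */) {
            std::printf(" [end of text]\n");
        } else {
            std::printf("nothing to say yet, and no crash\n");
        }
        return 0;
    }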
Name              Last commit                                       Date
embedding         llama : fix linkage with mingw (#551)             2023-03-28 21:23:09 +03:00
main              Do not crash when it has nothing to say. (#796)   2023-04-06 17:59:11 +02:00
perplexity        llama : fix linkage with mingw (#551)             2023-03-28 21:23:09 +03:00
quantize          Fix ggml_init_params in quantize                  2023-03-30 12:28:25 -07:00
alpaca.sh         Move chat scripts into "./examples"               2023-03-25 20:37:09 +02:00
chat-13B.bat      Create chat-13B.bat (#592)                        2023-03-29 20:21:09 +03:00
chat-13B.sh       Move chat scripts into "./examples"               2023-03-25 20:37:09 +02:00
chat.sh           If n_predict == -1, generate forever              2023-03-25 21:51:41 +02:00
CMakeLists.txt    Overhaul the examples structure                   2023-03-25 20:26:40 +02:00
common.cpp        fix default params for examples/main (#697)       2023-04-02 04:41:12 +02:00
common.h          main.cpp fixes, refactoring (#571)                2023-03-28 17:09:55 +03:00
gpt4all.sh        examples : add gpt4all script (#658)              2023-04-02 10:56:20 +03:00
Miku.sh           miku.sh : add executable bit (#780)               2023-04-05 18:59:13 +03:00
reason-act.sh     add example of re-act pattern (#583)              2023-03-29 10:10:24 -05:00