talk-llama

Talk with an LLaMA AI in your terminal

Demo Talk

Building

The talk-llama tool depends on the SDL2 library to capture audio from the microphone. You can build it like this:

# Install SDL2 on Linux
sudo apt-get install libsdl2-dev

# Install SDL2 on macOS
brew install sdl2

# Build the "talk-llama" executable
make talk-llama

# Run it
./talk-llama -mw ./models/ggml-small.en.bin -ml ../llama.cpp/models/13B/ggml-model-q4_0.bin -p "Georgi" -t 8
  • The -mw argument specifies the Whisper model to use. The base or small models are recommended for a real-time experience (see the example after this list for downloading a Whisper model).
  • The -ml argument specifies the LLaMA model to use. See the instructions at https://github.com/ggerganov/llama.cpp for how to obtain a ggml-compatible LLaMA model.
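As a sketch of how the Whisper model passed to -mw can be obtained, the download script that ships with whisper.cpp can be used; the model name below is just an example, and the LLaMA model must still be prepared separately following the llama.cpp instructions:

# Example: download the Whisper "small.en" model into ./models
# (any supported Whisper model name can be substituted)
bash ./models/download-ggml-model.sh small.en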

TTS

For the best experience, this example needs a TTS tool to convert the generated text responses to voice. You can use any TTS engine you like - simply edit the speak.sh script to your needs. By default, it is configured to use macOS's say, but you can use whatever you wish.
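
For illustration, a minimal speak.sh might look like the sketch below. It assumes the script receives the text to speak as its second argument; check the actual script in the repository for the exact interface it expects.

#!/bin/bash
# speak.sh - minimal sketch; assumes the text to speak is passed as "$2"

# macOS: use the built-in "say" command
say "$2"

# Linux alternative (requires espeak to be installed):
# espeak "$2"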