talk

Talk with an Artificial Intelligence in your terminal

Demo Talk

Web version: examples/talk.wasm

Building

The talk tool depends on the SDL2 library to capture audio from the microphone. You can build it like this:

# Install SDL2 on Linux
sudo apt-get install libsdl2-dev

# Install SDL2 on macOS
brew install sdl2

# Build the "talk" executable
make talk

# Run it
./talk -p Santa
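
The -p flag selects the persona that the assistant responds as (Santa in the example above). A hypothetical alternative invocation, assuming the flag accepts any free-form name that is then used when prompting GPT-2 (check ./talk --help for the full list of options on your build):

# Assumption: -p accepts an arbitrary persona name
./talk -p Einstein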

GPT-2

To run this, you will need a ggml GPT-2 model; see the instructions for obtaining one.

Alternatively, you can simply download the smallest ggml GPT-2 117M model (240 MB) like this:

wget --quiet --show-progress -O models/ggml-gpt-2-117M.bin https://huggingface.co/datasets/ggerganov/ggml/raw/main/ggml-model-gpt-2-117M.bin
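
Once downloaded, you can point talk at the model when launching it. A minimal sketch, assuming a -mg flag for the GPT-2 model path (the flag name is an assumption; run ./talk --help to confirm the options of your build):

# Assumption: -mg selects the GPT-2 model file; verify with ./talk --help
./talk -p Santa -mg models/ggml-gpt-2-117M.bin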

TTS

For the best experience, this example needs a TTS tool to convert the generated text responses to speech. You can use any TTS engine you like; simply edit the speak.sh script to suit your needs. By default, it is configured to use espeak.
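
For reference, a minimal replacement for speak.sh might look like the sketch below. It assumes the tool invokes the script with a voice id as the first argument and the text to speak as the second; check the bundled speak.sh for the exact calling convention before swapping in your own engine.

#!/bin/bash
# Minimal sketch of a custom speak.sh using espeak.
# Assumption: $1 is a voice id and $2 is the text to speak.
espeak -v en-us+m"$1" -s 175 -p 50 "$2"

Any TTS engine will work as long as the script converts its text argument into audible speech.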