# llama.cpp/awq-py/requirements.txt
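# Dependencies for the AWQ conversion scripts under awq-py.
# Typically installed with: pip install -r requirements.txt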
torch>=2.1.1
transformers>=4.32.0