llama.cpp/.devops
| File                 | Last commit                                             | Date                      |
|----------------------|---------------------------------------------------------|---------------------------|
| full-cuda.Dockerfile | docker : add support for CUDA in docker (#1461)         | 2023-07-07 21:25:25 +03:00 |
| full.Dockerfile      | Add llama.cpp docker support for non-latin languages (#1673) | 2023-06-08 00:58:53 -07:00 |
| main-cuda.Dockerfile | docker : add support for CUDA in docker (#1461)         | 2023-07-07 21:25:25 +03:00 |
| main.Dockerfile      | Add llama.cpp docker support for non-latin languages (#1673) | 2023-06-08 00:58:53 -07:00 |
| tools.sh             | docker : add '--server' option (#2174)                  | 2023-07-11 19:12:35 +03:00 |