models : change default hosting to Hugging Face

My Linode is running out of monthly bandwidth due to the high interest in the project.
Georgi Gerganov
2022-11-15 19:47:06 +02:00
parent 83c742f1a7
commit 864a78a8d0
4 changed files with 24 additions and 12 deletions


@@ -1,10 +1,13 @@
 ## Whisper model files in custom ggml format
 
 The [original Whisper PyTorch models provided by OpenAI](https://github.com/openai/whisper/blob/main/whisper/__init__.py#L17-L27)
-have been converted to custom `ggml` format in order to be able to load them in C/C++. The conversion has been performed using the
-[convert-pt-to-ggml.py](convert-pt-to-ggml.py) script. You can either obtain the original models and generate the `ggml` files
-yourself using the conversion script, or you can use the [download-ggml-model.sh](download-ggml-model.sh) script to download the
-already converted models from https://ggml.ggerganov.com
+have been converted to custom `ggml` format in order to be able to load them in C/C++. The conversion has been performed
+using the [convert-pt-to-ggml.py](convert-pt-to-ggml.py) script. You can either obtain the original models and generate
+the `ggml` files yourself using the conversion script, or you can use the [download-ggml-model.sh](download-ggml-model.sh)
+script to download the already converted models. Currently, they are hosted on the following locations:
+
+- https://huggingface.co/datasets/ggerganov/whisper.cpp
+- https://ggml.ggerganov.com
 
 Sample usage:
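For context, here is a minimal sketch of how the referenced download script is typically used and how a model could be fetched directly from the new Hugging Face host. The model name `base.en`, the `resolve/main/ggml-base.en.bin` URL pattern, and the `./main -m ... -f ...` invocation are assumptions based on common whisper.cpp usage, not contents of this commit.

```bash
# Sketch: fetch an already-converted model via the bundled script
# ("base.en" is an assumed example model name).
./download-ggml-model.sh base.en

# Alternative: download directly from the new Hugging Face host
# (the resolve/main/ path is an assumed URL pattern for the dataset repo).
wget https://huggingface.co/datasets/ggerganov/whisper.cpp/resolve/main/ggml-base.en.bin

# Then run transcription with the main example binary (assumed paths).
./main -m models/ggml-base.en.bin -f samples/jfk.wav
```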