Edgar Pino
> @edgar971 please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
>
> ```
> @microsoft-github-policy-service agree [company="{your company}"]
> ...
> ```
@microsoft-github-policy-service agree
Yes, put the model in your local `./models` folder, e.g. `/models/my-own-llm-ggml-chat.bin`, and update the environment variables in the docker compose file.
For example:

1. Find a GGML model on HuggingFace or use your own.
2. Copy the download link if using a HuggingFace model.
3. Update the `docker-compose.yml` file (see the sketch after this list):

```yml
version: '3.6'
...
```
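As a rough sketch of what the updated compose file could look like, assuming the service reads the model location from environment variables. The service name, image, and the variable names `MODEL` and `MODEL_DOWNLOAD_URL` here are illustrative placeholders, not confirmed names from this project:

```yml
version: '3.6'

services:
  llm-api:
    # illustrative image name; use the project's actual image
    image: ghcr.io/example/llm-api:latest
    environment:
      # hypothetical variables: point them at your local GGML file,
      # or at a HuggingFace download link if the model should be fetched
      - MODEL=/models/my-own-llm-ggml-chat.bin
      - MODEL_DOWNLOAD_URL=https://huggingface.co/your-model/resolve/main/model-ggml.bin
    volumes:
      # mount the local ./models folder into the container
      - ./models:/models
    ports:
      - "8000:8000"
```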
Makes sense; you can set that as an env variable in the docker compose file or in the run.sh file, for example:
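A minimal sketch of the compose-file option; `SOME_SETTING` is a placeholder name, not a variable defined by this project. In `run.sh`, the equivalent would be an `export SOME_SETTING=value` line (or a `-e SOME_SETTING=value` flag on `docker run`) before the container starts:

```yml
services:
  llm-api:
    environment:
      # placeholder variable name; substitute the real setting
      - SOME_SETTING=value
```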
This is a good idea. Any tools or posts about this? I can take a look at how to implement it.
Maybe this: https://gpt-index.readthedocs.io/en/latest/?