Changing embedding parameters
Is there any way to change the embedding context / chunk size? I'm using Ollama, and the terminal shows n_ctx=2048 even though the model I'm using supports up to 4096.
Right now, the embedding context cannot be changed through Ollama settings. You will have to configure it by modifying embedding_component, the same way this PR modified llm_component:
https://github.com/zylon-ai/private-gpt/pull/1703/files#diff-d1cc2631298b50677e869ca40d96be3d748f912661d694b916f3a99b5827fdf9R118
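As a minimal sketch of the idea, the Ollama branch of embedding_component could forward extra Ollama options to the embedding model, similar to what the linked PR does for the LLM. This assumes llama-index's `OllamaEmbedding` with its `ollama_additional_kwargs` parameter (availability depends on your installed `llama-index-embeddings-ollama` version), and the model name, URL, and hard-coded context value below are placeholders for illustration:

```python
# Hedged sketch: pass a larger context window to Ollama embeddings.
# The ollama_additional_kwargs parameter and the values below are assumptions;
# adapt them to your private-gpt settings and llama-index version.
from llama_index.embeddings.ollama import OllamaEmbedding

embedding_model = OllamaEmbedding(
    model_name="nomic-embed-text",       # whichever embedding model you run in Ollama
    base_url="http://localhost:11434",   # default Ollama endpoint
    # Extra options are forwarded to Ollama's embeddings request;
    # num_ctx raises the context window from the 2048 default to 4096.
    ollama_additional_kwargs={"num_ctx": 4096},
)
```

In private-gpt itself, the cleaner route would be to read that value from a new field in settings.yaml and pass it through in embedding_component, mirroring how the PR above wires the LLM options.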