2 comments by Chakradhar Guntuboina
Ollama sets the [context length of all models to 2048 by default](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size), even if the model supports a higher context length. Try manually increasing it and running again.
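As a quick sketch, one way to raise the context window is to pass `num_ctx` in the request options when calling Ollama's REST API, for example with Python's `requests`; the model name, prompt, and local endpoint below are assumptions, so substitute your own:

```python
import requests

# Ask a locally running Ollama server for a larger context window by
# setting "num_ctx" in the request options (raising it above the 2048 default).
response = requests.post(
    "http://localhost:11434/api/generate",   # default local endpoint (assumed)
    json={
        "model": "llama3",                    # assumed model name
        "prompt": "Summarize this long document ...",
        "options": {"num_ctx": 8192},         # increased context length
        "stream": False,
    },
)
print(response.json()["response"])
```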
Hi,

- To extract embeddings from the model, you should set `self.head` in `model/network.py` to `torch.nn.Identity()`. Can you kindly go through `tutorial.ipynb` and see if that answers your queries? If...
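For context, a minimal sketch of that head swap, assuming a generic PyTorch module with a `backbone` feature extractor and a `head` layer; the class name, backbone, and dimensions here are illustrative, not the actual code in `model/network.py`:

```python
import torch
import torch.nn as nn

class Network(nn.Module):
    """Illustrative stand-in for the network in model/network.py (assumed layout)."""

    def __init__(self, embed_dim=128, num_classes=2):
        super().__init__()
        self.backbone = nn.Sequential(        # hypothetical feature extractor
            nn.Linear(64, embed_dim),
            nn.ReLU(),
        )
        self.head = nn.Linear(embed_dim, num_classes)  # classification head

    def forward(self, x):
        features = self.backbone(x)
        return self.head(features)

model = Network()
# Replace the head with an identity so forward() returns the embeddings
# instead of class logits.
model.head = nn.Identity()

with torch.no_grad():
    embeddings = model(torch.randn(4, 64))    # shape: (4, embed_dim)
```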