fully-local-pdf-chatbot
Allow us to specify which Ollama model to use
I don't think there is a hard requirement to only use Mistral. Can we have a feature to use any Ollama model? I would like to use llama3 or phi3, for example.
Yes, that would be nice! As a short-term workaround, you can modify the source code.
Yes, I did that. Thanks!
Can you show me where I can change the source code?
Here:
https://github.com/jacoblee93/fully-local-pdf-chatbot/blob/main/components/ChatWindow.tsx#L116
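For anyone else landing here: the model name is hardcoded where the Ollama chat model is constructed. Below is a minimal sketch of the change, assuming the component instantiates LangChain's `ChatOllama`; the exact surrounding code and option values in your checkout may differ from this example.

```typescript
// components/ChatWindow.tsx (around the linked line)
import { ChatOllama } from "@langchain/community/chat_models/ollama";

// Swap the hardcoded "mistral" for any model you have pulled locally,
// e.g. "llama3" or "phi3" (run `ollama pull llama3` first).
const ollamaModel = new ChatOllama({
  baseUrl: "http://localhost:11434", // default local Ollama endpoint
  model: "llama3", // was: "mistral"
  temperature: 0.1,
});
```

Make sure the model name matches one listed by `ollama list`, or the request will fail at runtime.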