Bernardo Costa
I have added this important feature to my larger pull request (my first one ever). I gave you credit there, but I'm not sure that's the right way to do it...
Check my pull request (#45), where `chunk_size` and `max_new_tokens` have been updated, among other improvements.
I get this error too when I use HuggingFace: `embeddings = HuggingFaceInstructEmbeddings(model_name="hkunlp/instructor-xl")` `llm = HuggingFaceHub(repo_id="google/flan-t5-xxl", model_kwargs={"temperature":0.5, "max_length":512})` But everything works fine when using OpenAI: `embeddings = OpenAIEmbeddings()` `llm = ChatOpenAI()`
Solution for me: `pip install InstructorEmbedding sentence_transformers huggingface-hub` (explained here: https://youtu.be/dXxQ0LR-3Hg?si=hhCtkHBvJsu6VO1V&t=2258). You can also just go to requirements.txt and uncomment the respective packages.
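For anyone hitting the same error: the HuggingFace path needs optional packages that the OpenAI path does not, so the failure is usually just a missing import. A minimal sketch (assuming the package list above is what your setup needs; the helper and its package names are illustrative, not part of the project) to check for them up front:

```python
import importlib.util

# Extra packages required by HuggingFaceInstructEmbeddings / HuggingFaceHub,
# per the pip install command above (these are assumptions for this sketch).
# Note: the pip name "sentence_transformers" imports as "sentence_transformers",
# and "huggingface-hub" imports as "huggingface_hub".
required = ["InstructorEmbedding", "sentence_transformers", "huggingface_hub"]

# find_spec returns None when a module is not installed, without importing it.
missing = [pkg for pkg in required if importlib.util.find_spec(pkg) is None]

if missing:
    print("Missing optional dependencies:", ", ".join(missing))
    print("Fix: pip install " + " ".join(missing))
else:
    print("All HuggingFace dependencies are installed.")
```

Running this before switching from OpenAI to HuggingFace tells you exactly which package to install instead of failing inside LangChain.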
Thank you for your contribution @Rajat4Mahajan! I have added it to my pull request (#45).