weissenbacherpwc

Results 7 comments of weissenbacherpwc

Any experience with enabling streaming with llamacpp and FastAPI/Langserve?
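As a rough illustration of the streaming shape being asked about, here is a minimal stdlib-only sketch: an async generator yields tokens one by one, which is the same pattern FastAPI's `StreamingResponse` consumes. The `fake_token_stream` source is a stand-in assumption; in a real app the tokens would come from LlamaCpp's streaming callback, not a hard-coded list.

```python
import asyncio

# Hypothetical token source standing in for LlamaCpp's streaming output;
# in a real app each chunk would arrive from the model's callback handler.
async def fake_token_stream():
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # yield control, as real token arrival would
        yield token

async def stream_response():
    # In FastAPI this async generator would be handed to StreamingResponse;
    # here we just collect the tokens to show the flow end to end.
    chunks = []
    async for token in fake_token_stream():
        chunks.append(token)
    return "".join(chunks)

print(asyncio.run(stream_response()))  # -> Hello, world!
```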

@cam-barts This helped a lot! What is confusing to me is the filestore ("fs") versus the vectorstore. E.g. my code here, using Chroma as the vectorstore: ``` def run_db_build(): loader...
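To clarify the two-store split in plain terms, here is a LangChain-free sketch of the idea, under the assumption that this thread is about the parent-document retrieval pattern: the filestore ("fs") keeps the full parent documents, while the vectorstore (Chroma in the code above) indexes only small child chunks that each point back to a parent id. The substring "search" below is a deliberate stand-in for real similarity search.

```python
import uuid

# The "fs" role: parent_id -> full parent document.
docstore = {}
# The vectorstore role: (child_chunk, parent_id) pairs.
vector_index = []

def ingest(document, chunk_size=20):
    # Store the whole document once, then index only small chunks of it.
    parent_id = str(uuid.uuid4())
    docstore[parent_id] = document
    for i in range(0, len(document), chunk_size):
        vector_index.append((document[i:i + chunk_size], parent_id))
    return parent_id

def retrieve(query):
    # Stand-in for similarity search: naive substring match on child chunks.
    for chunk, parent_id in vector_index:
        if query in chunk:
            return docstore[parent_id]  # return the *parent*, not the chunk
    return None

pid = ingest("Chroma stores embeddings; the filestore keeps parents.")
print(retrieve("Chroma"))  # the full parent document comes back
```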

Got it with the directories, thanks! In the script, the easiest way would be to use the created `big_chunks_retriever` in the RetrievalQA chain. But if I think if one...

> > > @weissenbacherpwc correct! I usually have different ingestion and use scripts, but the way that you build the retriever will be the same. So in one part of...
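The separate ingestion/use split mentioned in the reply can be sketched as two steps that share persisted state on disk. This is an illustrative stdlib-only outline, not LangChain's or Chroma's actual API: the ingestion step serializes a docstore and a chunk index, and a separate use step reloads them so retrieval never has to re-ingest.

```python
import json, os, tempfile

def ingest_step(path, documents):
    # In a real setup Chroma would persist embeddings to its own directory;
    # here we just serialize a parent docstore plus a toy chunk index.
    store = {"docstore": {str(i): d for i, d in enumerate(documents)},
             "chunks": [[d[:10], str(i)] for i, d in enumerate(documents)]}
    with open(path, "w") as f:
        json.dump(store, f)

def use_step(path):
    # A separate script reloads the same stores and rebuilds the retriever
    # from them at query time.
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.mkdtemp(), "store.json")
ingest_step(path, ["first document", "second document"])
store = use_step(path)
print(store["docstore"]["0"])  # -> first document
```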

> I could make llamacpp work with Langserve by applying #9177 (or #10908) and adding the parameter `chunk=chunk` to `run_manager.on_llm_new_token()` in `_astream()`. Can you share your complete solution? I am also...
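The `chunk=chunk` fix being quoted can be illustrated with a minimal stand-in for the `_astream()` pattern: each new token is wrapped in a chunk, the callback manager is notified with `chunk=chunk`, and the chunk is yielded to the consumer. The `Recorder` class and dict-based chunk are assumptions for the sketch, not the actual LangChain classes.

```python
import asyncio

class Recorder:
    # Stand-in for a callback manager; without passing chunk=chunk,
    # the chunk argument would stay None and streaming consumers break.
    def __init__(self):
        self.chunks = []
    async def on_llm_new_token(self, token, *, chunk=None):
        self.chunks.append(chunk)

async def astream(tokens, run_manager):
    for token in tokens:
        chunk = {"text": token}  # stand-in for a GenerationChunk
        # The referenced patches add chunk=chunk to this call:
        await run_manager.on_llm_new_token(token, chunk=chunk)
        yield chunk

async def main():
    rec = Recorder()
    out = [c async for c in astream(["a", "b"], rec)]
    return out, rec.chunks

out, seen = asyncio.run(main())
print(out == seen)  # -> True: callbacks saw exactly the yielded chunks
```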

> my branch Thanks @akionux! Does this only work with Langserve, or could it also work with just FastAPI?

Also looking for such a function.