Akio Nishimura

3 comments by Akio Nishimura

I could make llamacpp work with langserve by applying #9177 (or #10908) and adding the parameter `chunk=chunk` to the `run_manager.on_llm_new_token()` call in `_astream()`.
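A minimal, runnable sketch of the pattern described above. The stub classes below only stand in for LangChain's real `GenerationChunk` and async callback manager; the names mirror the library, but this is an illustration of the `chunk=chunk` change, not the actual implementation in `llamacpp.py`.

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncIterator, List


@dataclass
class GenerationChunk:
    """Stub for LangChain's GenerationChunk (text only, for illustration)."""
    text: str


class StubRunManager:
    """Stub callback manager that records the chunks it receives."""

    def __init__(self) -> None:
        self.chunks: List[GenerationChunk] = []

    async def on_llm_new_token(self, token: str, *, chunk: GenerationChunk) -> None:
        # The point of the fix: the full chunk is forwarded alongside the
        # token string, so streaming consumers get GenerationChunk objects.
        self.chunks.append(chunk)


async def _astream(prompt: str, run_manager: StubRunManager) -> AsyncIterator[GenerationChunk]:
    # Stand-in for the llama.cpp token stream.
    for token in prompt.split():
        chunk = GenerationChunk(text=token)
        yield chunk
        # Before the fix this call passed only the token; adding chunk=chunk
        # is what made streaming propagate correctly.
        await run_manager.on_llm_new_token(token, chunk=chunk)


async def main() -> List[str]:
    rm = StubRunManager()
    out = [c.text async for c in _astream("streaming works now", rm)]
    # Every yielded chunk was also delivered to the callback manager.
    assert [c.text for c in rm.chunks] == out
    return out


print(asyncio.run(main()))
```

With the keyword argument in place, each yielded chunk reaches the callback manager, which is what langserve's streaming endpoint relies on.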

> > I could make llamacpp work with langserve by applying #9177 (or #10908) and adding the parameter `chunk=chunk` to the `run_manager.on_llm_new_token()` call in `_astream()`.
>
> Can you share your complete solution?...

> Thanks @akionux! Does this only work with Langserve, or could this also work with just FastAPI?

I checked that the langserve playground works, so it may also work with FastAPI.