Aleksandr Lifanov
Like using StarCoder with LangChain to replace OpenAI function calling?
I had a similar issue. The biggest delay was in the response from the OpenAI API, because it waits for the whole answer (the whole generation). You could try using stream=True to get partial results as they are generated.
I've also tried using the OpenAI API directly; it was likewise slow for large responses.
https://python.langchain.com/en/latest/reference/modules/chat_models.html?highlight=streaming#langchain.chat_models.ChatOpenAI.streaming
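For reference, here is a minimal sketch of enabling streaming with LangChain's `ChatOpenAI` so tokens are printed as they arrive instead of waiting for the full generation. It assumes the legacy `langchain` package and an `OPENAI_API_KEY` set in the environment:

```python
from langchain.chat_models import ChatOpenAI
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.schema import HumanMessage

# streaming=True makes the model emit tokens as they are generated;
# the stdout callback prints each token immediately, so the user sees
# output right away instead of waiting for the whole completion.
chat = ChatOpenAI(
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],
    temperature=0,
)

response = chat([HumanMessage(content="Summarize what streaming responses are.")])
```

The total generation time stays roughly the same, but the perceived latency drops because the first tokens appear almost immediately.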
@aphrodite1028 was it successful?
> > @aphrodite1028 was it successful?
>
> yes, it works for me, but needs some development, thanks

Could you provide a guide or how-to for implementing it?
Try removing the .cache folder. That worked for me.
@GrandNative could you describe it or give an example? I've run into the same issue.