Thoonsen Maxime

Results: 84 comments by Thoonsen Maxime

Hey @tonyabracadabra. Yes, the final state should be all databases supported by TypeORM. For the moment, pg and sqlite. But it's not a lot of work to add one as...

Working on it here: https://github.com/hwchase17/langchainjs/pull/144

Hey @aidec, I could work on that. Do you have an example of a too-large JSON that we could use for testing? Can you provide the code you are...

Hey @Achalogy, can you provide some examples of `texts` and `query`, and what `this.getChatPrompt()` returns? Did you try not passing any prompt to the chain, like: `const...

Hey @Rishab1207, did it work for you? Here's a bit of code in case you are still struggling: ```typescript const llmChain = new LLMChain({ llm: model, prompt: promptTemplateBot }) const stuffChain...
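The snippet above is cut off, but the idea is the "stuff" documents pattern: concatenate all retrieved documents into one context string and interpolate it into the prompt the LLM chain receives. A minimal self-contained sketch of that mechanism (no langchain imports; `stuffDocuments`, `formatPrompt`, and the template text are illustrative names, not the library's API):

```typescript
// Simplified sketch of the "stuff" documents pattern, assuming a plain
// Document shape and a {variable}-style prompt template.

interface Document {
  pageContent: string;
}

// Hypothetical helper: joins all documents into a single context string.
function stuffDocuments(docs: Document[], separator = "\n\n"): string {
  return docs.map((d) => d.pageContent).join(separator);
}

// Hypothetical formatter mirroring what a prompt template does:
// replace each {name} placeholder with the matching value.
function formatPrompt(template: string, values: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (_, key: string) => values[key] ?? `{${key}}`);
}

const docs: Document[] = [
  { pageContent: "LangChain supports composing chains." },
  { pageContent: "Stuffing puts all documents into one prompt." },
];

const template = "Answer using this context:\n{context}\n\nQuestion: {question}";

const prompt = formatPrompt(template, {
  context: stuffDocuments(docs),
  question: "What is stuffing?",
});
console.log(prompt);
```

In the real library the stuff chain wraps the `llmChain` from the truncated snippet and performs this concatenation and interpolation for you before calling the model.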

Hey @Rishab1207, I just finished an entire [presentation on it](https://docs.google.com/presentation/d/1TMfaJwXg50n6aCT8wdee9hOMqgoLOoCWLJJ405bqBbk/edit?pli=1#slide=id.g249d4435930_0_381). Maybe this code will help you as well: ```typescript const model = new OpenAI({ modelName: 'gpt-3.5-turbo' }) const promptTemplateBot...

@Rishab1207 I have reread your question. You don't need to pass {context} and {question} manually. The context is pulled during the processing of the chain, and the question when...
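To illustrate the point above, here is a self-contained sketch of how a QA chain fills `{context}` and `{question}` on its own at call time: the context comes from the retriever, the question from the user's input. `fakeRetriever` and `runQaChain` are illustrative names, not langchain's API:

```typescript
// Sketch: the chain, not the caller, resolves {context} and {question}.

type Retriever = (query: string) => string[];

// Stand-in for a vector-store retriever: returns documents for a query.
const fakeRetriever: Retriever = (query) => [
  `Doc A about ${query}`,
  `Doc B about ${query}`,
];

function runQaChain(retriever: Retriever, question: string): string {
  // The chain pulls the context from the retriever during processing...
  const context = retriever(question).join("\n");
  // ...and injects both variables into the prompt for you.
  return `Context:\n${context}\nQuestion: ${question}`;
}

const filled = runQaChain(fakeRetriever, "vector stores");
console.log(filled);
```

So the only value you hand the chain is the question; everything else is resolved internally.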

Hey @Rishab1207, I haven't played much with history, so I can't answer you. > How the prompt variables are automatically pulled in? I want to pass the conversation/conversation summary as...