
Feat/openai n

Open jacoblee93 opened this issue 10 months ago • 3 comments

@davidfant @functorism this is roughly what it would take to have the same feature set as .batch (separate runs per input, with error handling).

I think the maintenance overhead would be tough - would you all feel comfortable calling .generate() directly instead?

jacoblee93 avatar Apr 13 '24 00:04 jacoblee93

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name | Status | Updated (UTC)
langchainjs-api-refs | ✅ Ready | Apr 13, 2024 0:51am
langchainjs-docs | ✅ Ready | Apr 13, 2024 0:51am

vercel[bot] avatar Apr 13 '24 00:04 vercel[bot]

@jacoblee93 hmm, agree that this doesn't look clean on the ChatOpenAI side. Is there any other good way to accomplish this kind of batching without hacking the batch fn and without using generate? The problem with generate is that I want to use the same interface across my app when doing LLM calls with an arbitrary BaseChatModel (mostly Claude and GPT-4). I want to avoid "if OpenAI, generate in a special way" branching.

davidfant avatar Apr 18 '24 08:04 davidfant

Yeah, I gotcha - unfortunately nothing comes to mind. CC @baskaryan @eyurtsev :(

I think for now you could wrap it in a custom function? I hear you on wanting a unified interface for sure though.

jacoblee93 avatar Apr 22 '24 20:04 jacoblee93
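The custom-function workaround suggested above can be sketched model-agnostically. This is a hypothetical helper, not part of langchainjs: it accepts anything exposing an `invoke` method (the shape shared by chat models in the thread's discussion), runs one call per input, and uses `Promise.allSettled` so one failing input doesn't reject the whole batch - roughly the "separate runs per input, with error handling" behavior described for .batch. The `Invokable` interface and `fakeModel` stand-in are illustrative assumptions.

```typescript
// Minimal shape assumed for any model-like object (e.g. a BaseChatModel
// instance would satisfy this via its invoke method).
interface Invokable<I, O> {
  invoke(input: I): Promise<O>;
}

// Per-input result: either a value or the error that call threw.
type BatchResult<O> =
  | { ok: true; value: O }
  | { ok: false; error: unknown };

// Hypothetical wrapper: one invoke per input, errors captured per input
// instead of failing the entire batch.
async function batchWithErrors<I, O>(
  model: Invokable<I, O>,
  inputs: I[],
): Promise<BatchResult<O>[]> {
  // Promise.allSettled never rejects; each entry records its own outcome.
  const settled = await Promise.allSettled(
    inputs.map((input) => model.invoke(input)),
  );
  return settled.map((r) =>
    r.status === "fulfilled"
      ? { ok: true, value: r.value }
      : { ok: false, error: r.reason },
  );
}

// Toy stand-in for a chat model; swap in any real object with `invoke`.
const fakeModel: Invokable<string, string> = {
  async invoke(input: string): Promise<string> {
    if (input === "bad") throw new Error("boom");
    return `echo: ${input}`;
  },
};

async function demo(): Promise<BatchResult<string>[]> {
  return batchWithErrors(fakeModel, ["hi", "bad"]);
}
```

Because the helper only depends on the `invoke` shape, the same call site works for Claude, GPT-4, or any other BaseChatModel, which is the unified interface davidfant was after.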