llmflows
refactor: why not just use containers?
https://github.com/stoyan-stoyanov/llmflows/blob/6ce79a8cc3b06e799d2b6eb6f6e8e78354f68dcd/llmflows/llms/openai_embeddings.py#L66-L68
If the whole point of distinguishing VectorDoc from List[VectorDoc] is to wrap the single item in a list and then access the generated output at index 0, why not make this function accept a container type?
If the answer is that it has to match the function signature of the interface, I can already see this doesn't hold - see #17
> because the embeddings class can generate embeddings in a batch
I don't think I fully understand the motivation; perhaps it is related to the ML magic. Still, I would recommend finding a generalised behaviour for this method.
The OpenAI API can generate embeddings for a single string as well as for a list of strings, so I thought this class, being a wrapper around that API, should have the same behavior. Any ideas for a generalized behavior? By the way, did you have a chance to look over the VectorDoc class? I am more nervous about its behavior.
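One possible generalization, sketched below: normalize the input to a list up front, remember whether the caller passed a single item, and unwrap the result accordingly so the output shape mirrors the input shape. This is only an illustration, not llmflows code - the function names are hypothetical, and the embedding call is replaced by a runnable placeholder instead of a real OpenAI request.

```python
from typing import List, Tuple, Union

def generate_embeddings(
    texts: Union[str, List[str]]
) -> Tuple[List[List[float]], bool]:
    """Hypothetical sketch: accept either a single string or a batch,
    embed uniformly, and report whether the input was a single item."""
    single = isinstance(texts, str)
    batch = [texts] if single else list(texts)
    # Placeholder for the real embeddings API call; a fake
    # deterministic "embedding" keeps the sketch self-contained.
    embeddings = [[float(len(t))] for t in batch]
    return embeddings, single

def embed(texts: Union[str, List[str]]):
    """Unwrap the result so a single input yields a single embedding
    and a list input yields a list of embeddings."""
    embeddings, single = generate_embeddings(texts)
    return embeddings[0] if single else embeddings
```

With this shape the caller never has to know about the wrap-then-index-0 trick; `embed("hello")` returns one embedding and `embed(["a", "b"])` returns a list, matching the OpenAI API's own behavior.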
@inspiralpatterns @vickychudinov I would love to get a bit more feedback on the usage of the VectorDoc and VectorStore classes - especially around the fact that, as it stands, the user creates a vector doc, but then the vector doc has to get its embeddings from a VectorStore. Please take a look at examples 12_ and 13_ and the Vector Stores and Question Answering sections in the User guide if you have some time ❤️