Add 'truncate' parameter for CohereEmbeddings

ephe-meral opened this issue 2 years ago · 0 comments

Currently, the `truncate` parameter of the Cohere API is not supported.

This means that by default, if one tries to generate an embedding for a text that is too long, the call simply fails with an error (which is frustrating when using this embedding source e.g. with GPT-Index, because the error is hard to handle properly when generating many embeddings). With the parameter, one can instead truncate either the START or the END of the text to fit the maximum token length and still generate an embedding without raising an error.

In this PR, I added this parameter to the class.
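As a minimal sketch of the semantics this parameter selects (the function name and token handling here are illustrative, not Cohere's actual implementation; the mode values follow the START/END options described above, with NONE meaning no truncation):

```python
def truncate_tokens(tokens, max_tokens, mode="END"):
    """Illustrative truncation: keep at most max_tokens tokens.

    mode "END":   drop tokens from the end, keeping the start of the text.
    mode "START": drop tokens from the start, keeping the end of the text.
    mode "NONE":  never truncate (the caller would get an error instead).
    """
    if mode == "NONE" or len(tokens) <= max_tokens:
        return list(tokens)
    if mode == "START":
        return list(tokens[-max_tokens:])  # keep the trailing tokens
    return list(tokens[:max_tokens])       # keep the leading tokens
```

With the parameter exposed on the class, a caller could pick e.g. `truncate="END"` and oversized inputs would be embedded from their truncated prefix instead of failing.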

Arguably, there should be a better way to handle this error, e.g. an optional callback that is triggered when the token limit is reached and can split the document (or similar). Especially in the GPT-Index use case, it's often hard to estimate the token count for each document, and I'd rather filter out or split the troublemakers than interrupt the whole run. Thoughts?
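The split-instead-of-fail idea could be sketched roughly like this (all names here are hypothetical; `embed_fn`, `count_tokens`, and `split_fn` stand in for whatever embedding call, tokenizer, and splitter a real integration would use):

```python
def embed_with_fallback(texts, embed_fn, max_tokens, count_tokens, split_fn):
    """Embed each text; split over-limit texts into chunks instead of failing.

    Returns a list of (chunk, embedding) pairs so callers can map
    chunk embeddings back to their source documents.
    """
    results = []
    for text in texts:
        if count_tokens(text) <= max_tokens:
            results.append((text, embed_fn(text)))
        else:
            # Over the limit: embed each chunk separately rather than erroring.
            for chunk in split_fn(text, max_tokens):
                results.append((chunk, embed_fn(chunk)))
    return results


# Toy usage with whitespace "tokens" and a dummy embedder:
def _count(text):
    return len(text.split())

def _split(text, n):
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(0, len(words), n)]

pairs = embed_with_fallback(
    ["short text", "one two three four five"],
    embed_fn=lambda t: [float(len(t))],  # dummy embedding
    max_tokens=3,
    count_tokens=_count,
    split_fn=_split,
)
```

This keeps a batch run alive when a single oversized document would otherwise abort the whole job, at the cost of the caller having to aggregate per-chunk embeddings.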

ephe-meral · Jan 29 '23 18:01