[Feature request]: Handle Splitting?
What do you need?
It would be great if fabric could automatically handle splitting/chunking of text that is too large for a given model's context window.
From what I understand, this would need the following (a rough sketch of the recursive option is included after the list):
- Information about the token limit of the model being used
- A way to count the tokens in a specific request
- A range of splitting options (character, word, sentence, recursive, semantic, etc.) to choose from
- Possibly a way to select the LLM used for semantic splitting
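To illustrate the recursive option, here is a minimal, self-contained sketch. It is not fabric's actual implementation: the names (`estimateTokens`, `splitRecursive`) are hypothetical, and the ~4-characters-per-token estimate is a crude placeholder. A real version would query the provider's tokenizer and the model's advertised token limit, and could swap in the other strategies (sentence, semantic, etc.) behind the same interface.

```go
// Hypothetical sketch of recursive splitting with a crude token estimate.
// A real implementation would use the model's tokenizer and context window.
package main

import (
	"fmt"
	"strings"
)

// estimateTokens is a rough heuristic (~4 characters per token); actual
// counts depend on the model's tokenizer.
func estimateTokens(s string) int {
	return len([]rune(s))/4 + 1
}

// splitRecursive breaks text on progressively finer separators until each
// chunk fits within maxTokens.
func splitRecursive(text string, maxTokens int, separators []string) []string {
	if estimateTokens(text) <= maxTokens || len(separators) == 0 {
		return []string{text}
	}
	sep := separators[0]
	parts := strings.Split(text, sep)

	var chunks []string
	var current strings.Builder
	for _, part := range parts {
		candidate := part
		if current.Len() > 0 {
			candidate = current.String() + sep + part
		}
		if estimateTokens(candidate) <= maxTokens {
			current.Reset()
			current.WriteString(candidate)
			continue
		}
		// Current chunk is full; flush it before handling this piece.
		if current.Len() > 0 {
			chunks = append(chunks, current.String())
			current.Reset()
		}
		if estimateTokens(part) > maxTokens {
			// This piece alone is still too big: recurse with finer separators.
			chunks = append(chunks, splitRecursive(part, maxTokens, separators[1:])...)
		} else {
			current.WriteString(part)
		}
	}
	if current.Len() > 0 {
		chunks = append(chunks, current.String())
	}
	return chunks
}

func main() {
	text := strings.Repeat("Fabric patterns work best on focused input. ", 200)
	chunks := splitRecursive(text, 256, []string{"\n\n", "\n", ". ", " "})
	fmt.Printf("split into %d chunks\n", len(chunks))
}
```

Each chunk could then be run through the selected pattern separately, with the per-chunk outputs optionally merged in a final pass.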