ragflow
[Question]: Uploaded a 4 KB txt document; parsing ran overnight, consumed 30 million tokens, split it into over 3,000 chunks, and is now stuck at 99.7%.
Describe your problem
Is it deployed locally, or could you share a sample of the text file? It's odd that a 4 KB text file would generate 3,000 chunks. My guess is it's stuck on Elasticsearch indexing. 3,000 chunks also means the LLM is called 3,000 times. I suggest enlarging the chunk token number in the configuration. BTW, you can also choose other LLM suppliers.
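For reference, a minimal sketch of raising the chunk token number through RAGFlow's HTTP API. The base URL, API key, dataset ID, and the exact `parser_config` field name here are assumptions based on the dataset-update endpoint; check the API reference for your version before using it.

```python
import json
import urllib.request

# Hypothetical values -- replace with your deployment's URL, API key, and dataset ID.
BASE_URL = "http://localhost:9380"
API_KEY = "ragflow-xxxx"
DATASET_ID = "your-dataset-id"

# Raise the per-chunk token budget so a small file produces far fewer chunks
# (fewer chunks means fewer LLM calls during parsing).
payload = {"parser_config": {"chunk_token_num": 512}}

req = urllib.request.Request(
    f"{BASE_URL}/api/v1/datasets/{DATASET_ID}",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="PUT",
)
# urllib.request.urlopen(req)  # uncomment to send against a live server
```

Alternatively, the same setting is exposed in the web UI under the knowledge base's parsing configuration, which avoids touching the API at all.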