mlc-llm
How to implement text embeddings stored in a vector database on an iOS device?
For large custom data stored on device (PDF, text, XML): how can that data be turned into text embeddings and stored locally in an on-device vector database? The goal is to then perform a vector search and submit the results, along with the input query, to the LLM for tasks such as summarization and question answering.
Have you found any information on how to do this?
Any updates on this?
Will mlc-llm support this use-case, or should we not wait for that?