Knowledge-QA-LLM
QA based on local knowledge and LLM.
📣 We're looking for front-end engineers interested in knowledge QA with LLMs who can help us separate the front end and back end of the current implementation.
Introduction
- Questions & Answers based on local knowledge base + LLM.
- Reason:
  - The idea for this project comes from Langchain-Chatchat.
  - I have used that project before, but it is not very flexible and is not easy to deploy.
  - This project borrows ideas from How to build a knowledge question answering system with a large language model and serves as a practice implementation of them.
- Advantage:
  - The whole project is modular and does not depend on the `langchain` library; each part can easily be replaced, and the code is simple and easy to understand (see the sketch below).
  - Apart from the large language model interface, which needs to be deployed separately, all other parts can run on CPU.
  - Supports documents in common formats, including `txt`, `md`, `pdf`, `docx`, `pptx`, `excel`, etc. Other document types can also be supported through customization.
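To make the modular design concrete, here is a minimal sketch of how the pieces might fit together. All class names and method signatures below (`Extractor`, `Embedder`, `VectorStore`, `KnowledgeQA`, etc.) are illustrative assumptions, not the project's actual API:

```python
# Illustrative sketch of a modular local-knowledge QA pipeline.
# All names here are hypothetical and only show how parts can be swapped.
from dataclasses import dataclass
from typing import List, Protocol


class Extractor(Protocol):
    def extract(self, path: str) -> List[str]: ...          # file -> text chunks


class Embedder(Protocol):
    def encode(self, texts: List[str]) -> List[List[float]]: ...


class VectorStore(Protocol):
    def add(self, chunks: List[str], vectors: List[List[float]]) -> None: ...
    def search(self, vector: List[float], top_k: int = 3) -> List[str]: ...


class LLM(Protocol):
    def __call__(self, prompt: str) -> str: ...              # separately deployed model API


@dataclass
class KnowledgeQA:
    extractor: Extractor
    embedder: Embedder
    store: VectorStore
    llm: LLM

    def ingest(self, path: str) -> None:
        # Extract text from a document, embed the chunks, and index them.
        chunks = self.extractor.extract(path)
        self.store.add(chunks, self.embedder.encode(chunks))

    def ask(self, question: str) -> str:
        # Retrieve the most relevant chunks and let the LLM answer from them.
        context = "\n".join(self.store.search(self.embedder.encode([question])[0]))
        prompt = (
            "Answer the question using only the context below.\n"
            f"Context:\n{context}\n\nQuestion: {question}"
        )
        return self.llm(prompt)
```

Because every dependency sits behind a small interface, replacing the document extractor, the embedding model, or the LLM backend only means passing in another object with the same methods; everything except the LLM call can run on CPU.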
Demo
⚠️ If you have a Baidu account, you can visit the online demo based on ERNIE Bot.

Documentation
Full documentation can be found in the docs (in Chinese).
TODO
- [ ] Support keyword + vector hybrid search.
- [ ] Vue.js-based UI.
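As a rough illustration of the planned keyword + vector hybrid search, one common approach is to fuse a keyword score (e.g. BM25) with embedding cosine similarity using a tunable weight. The function below is only a sketch of that idea under those assumptions, not the project's implementation:

```python
# Hypothetical sketch of hybrid retrieval: fuse keyword and vector scores.
import math
from typing import Dict, List


def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def hybrid_rank(
    keyword_scores: Dict[str, float],       # e.g. BM25 score per chunk id
    query_vec: List[float],
    chunk_vecs: Dict[str, List[float]],     # embedding per chunk id
    alpha: float = 0.5,                     # weight between keyword and vector scores
    top_k: int = 3,
) -> List[str]:
    scores = {
        cid: alpha * keyword_scores.get(cid, 0.0)
        + (1 - alpha) * cosine(query_vec, vec)
        for cid, vec in chunk_vecs.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

In practice the two scores would typically be normalized to a comparable range before being combined.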
Code Contributors
Contributing
- Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
- Please make sure to update tests as appropriate.
Sponsor
If you would like to sponsor the project, click the Buy me a coffee image directly, and please leave a note (e.g. your GitHub account name) so that you can be added to the sponsorship list below.