LLM / Artificial Intelligence trained on my own BookStack library, in order to find past notes?
Describe the feature you'd like
Are there any plans to integrate an LLM into BookStack? It might sound insane, but it could be useful for locating information in one's BookStack library or for refreshing one's memory on what's there. Think of using it, for instance, to create recaps of books or pages.
Describe the benefits this would bring to existing BookStack users
I'm collecting so much information and have so many projects going on that I find it increasingly difficult to keep track of my own records.
I assume many others have similar problems.
Can the goal of this request already be achieved via other means?
Only by manually browsing or searching for the right keywords.
Have you searched for an existing open/closed issue?
- [x] I have searched for existing issues and none cover my fundamental request
How long have you been using BookStack?
Over 5 years
Additional context
No response
Hi @Wookbert,
I started on a proof of concept of a somewhat native integration of LLM based search back in March. My draft PR branch with research can be found in #5552. I just added a video preview of my proof of concept in a comment there to provide some visuals: https://github.com/BookStackApp/BookStack/pull/5552#issuecomment-2884752947
There are quite a few open questions & considerations around this though. It's on pause right now while I crack on with the next feature release, but my plan was to come back to it and develop it a little further after this current release cycle is done.
Hello!
From what I've heard, the bigger LLMs rely on the GPU rather than the CPU for their calculations. I currently don't even have a GPU in my homelab server.
It would be great if the LLM integration were configurable, so that it's not in use at all unless it has been explicitly configured.
Hi @Kristoffeh, We probably wouldn't ship models directly as part of BookStack at all, and this would be an optional addition on top of the default system due to the external requirements.
My current implementation uses OpenAI-like APIs, which others seem to support as a somewhat unofficial standard. The idea is that you'd be able to integrate with an external system of your choice that supports this API, including a self-hosted instance of something like Ollama using self-hosted models (potentially on another system), or existing LLM services.
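For illustration, here's a minimal sketch of what talking to such an OpenAI-compatible endpoint looks like. The endpoint URL, model name, and prompt below are assumptions for the example (Ollama exposes an OpenAI-compatible API under `/v1`), not part of the actual BookStack implementation:

```python
import json
import urllib.request

# Assumed endpoint: a self-hosted Ollama instance serving the
# OpenAI-compatible chat-completions API. Adjust URL/model as needed.
API_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_request(question: str, model: str = "llama3") -> bytes:
    """Build an OpenAI-style chat-completions payload as JSON bytes."""
    payload = {
        "model": model,
        "messages": [
            # Hypothetical system prompt for a notes-search assistant.
            {"role": "system", "content": "Answer using my BookStack notes."},
            {"role": "user", "content": question},
        ],
    }
    return json.dumps(payload).encode("utf-8")


def ask(question: str) -> str:
    """Send the request and return the assistant's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=build_chat_request(question),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply at choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Because the request/response shape is the same across these servers, the same client code can point at Ollama, another self-hosted server, or a hosted LLM service just by changing the URL and credentials.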
Hello again @ssddanbrown and thanks for taking the time. Okay, that sounds great!
https://docs.onyx.app/connectors/bookstack The open-source project Onyx (previously Danswer) already has BookStack as a connector. I've been using it for the past year and it's good. You can configure the update/delete intervals, agents based on different document sets, etc. Sharing here in case it's useful.
Does Onyx support book/document access control?