LLM split functionality into related add-ons
The llm add-on should really just provide the integration with all of the LLMs supported. The actual ZAP integrations should be moved to be sub-extensions of the relevant add-ons, e.g.
- alertfilter
- openapi
Will start looking at splitting the Alert Review functionality into the Alert Filter add-on.
Better to start laying out the foundation to do so.
Could you be more specific?
Actually, I'll look at it. I'm guessing you mean there are generic things that should be exposed via the Extension (class) that'd make life easier for other extensions/add-ons to use or consume.
How's this sound as the initial implementation?
- Allow to get the options set.
- Allow to check isConfigured.
- Allow to get a LlmCommunicationService and set their own LlmAssistant implementation (This part I'm not 100% sure about, but I "think" it makes sense.)
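To make the proposal concrete, here is a minimal sketch of what that extension surface could look like. This is an assumption, not the actual add-on code: the class and method names (`ExtensionLlm`, `LlmOptions`, `LlmCommunicationService`, `LlmAssistant`, `getCommunicationService`) mirror the names in the discussion, and the stubs stand in for the real implementations.

```java
// Hypothetical sketch of the extension surface discussed above.
// All types here are simplified stand-ins, not the real add-on classes.

class LlmOptions {
    private String endpoint;

    String getEndpoint() { return endpoint; }

    void setEndpoint(String endpoint) { this.endpoint = endpoint; }
}

// Dependent add-ons provide their own assistant implementation.
interface LlmAssistant {
    String chat(String prompt);
}

// Wraps the assistant; the real service would do the langchain4j wiring.
class LlmCommunicationService {
    private final LlmAssistant assistant;

    LlmCommunicationService(LlmAssistant assistant) {
        this.assistant = assistant;
    }

    String send(String prompt) { return assistant.chat(prompt); }
}

class ExtensionLlm {
    private final LlmOptions options = new LlmOptions();

    // 1. Allow to get the options set.
    LlmOptions getOptions() { return options; }

    // 2. Allow to check isConfigured.
    boolean isConfigured() {
        return options.getEndpoint() != null && !options.getEndpoint().isEmpty();
    }

    // 3. Allow to get a LlmCommunicationService with a caller-supplied assistant.
    LlmCommunicationService getCommunicationService(LlmAssistant assistant) {
        return new LlmCommunicationService(assistant);
    }
}
```

With something like this, another add-on would only depend on `ExtensionLlm` and its own `LlmAssistant` interface, not on the LLM plumbing.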
- Why allow to get the options set?
- Sounds good if done through the extension.
- That means we'll have to spill the langchain4j implementation, maybe that's fine. If we are doing that we could just add a factory for the "assistants".
- Why allow to get the options set?
Actually I guess if the other code can get a (or access to) LlmCommunicationService then it doesn't need access to the options.
- That means we'll have to spill the langchain4j implementation, maybe that's fine. If we are doing that we could just add a factory for the "assistants".
I'm open to ideas/suggestions.
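One way the "factory for the assistants" idea could look: the LLM add-on builds an implementation of a caller-supplied assistant interface, so dependent add-ons never touch langchain4j directly. In the real add-on the factory body would delegate to langchain4j's `AiServices`; in this sketch a dynamic proxy stands in for that wiring so the shape is runnable without the dependency. `AssistantFactory` and `createAssistant` are hypothetical names.

```java
import java.lang.reflect.Proxy;
import java.util.function.Function;

// Hypothetical factory: hides the LLM wiring behind the add-on's own API.
class AssistantFactory {
    // Stand-in for the configured chat model (prompt in, completion out).
    private final Function<String, String> model;

    AssistantFactory(Function<String, String> model) {
        this.model = model;
    }

    // Build an implementation of any single-String-method assistant interface.
    // The real version would hand the interface to langchain4j instead.
    <T> T createAssistant(Class<T> assistantInterface) {
        Object proxy = Proxy.newProxyInstance(
                assistantInterface.getClassLoader(),
                new Class<?>[] { assistantInterface },
                (p, method, args) -> model.apply((String) args[0]));
        return assistantInterface.cast(proxy);
    }
}
```

The appeal of this shape is that each dependent add-on declares its own assistant interface (e.g. one with the existing review methods) and asks the LLM add-on to realize it, though as noted below the langchain4j annotations on that interface may still leak the dependency.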
I was just trying to picture moving all the Alert Review code out of the current add-on and into Alert Filter. It seemed/seems like:
- Most of the bits can be taken out of LlmCommunicationService and put into the Menu class or an Action class (or similar).
- It needs to be able to communicate and in order to communicate an Assistant needs to be established. (Assuming we keep the current mode of operation....)
- I guess another option would be to have the LLM add-on keep a registration map of assistants, e.g. Map.of("AlertReview", AlertReviewAssistant), with AlertReviewAssistant being an LlmAssistant implementer (with the existing review methods and associated code). Then Alert Filter could get ExtensionLlm and call registerAssistant. I guess that still means it needs langchain4j for the annotations... hrm
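The registration-map option above could be sketched like this. The method names (`registerAssistant`, `getAssistant`) and the `"AlertReview"` key come from the discussion; the typed lookup is an assumption added so callers get their own assistant interface back without casting.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical registry kept by the LLM add-on; dependent add-ons
// (e.g. Alert Filter) register and look up their assistants by name.
class ExtensionLlmRegistry {
    private final Map<String, Object> assistants = new ConcurrentHashMap<>();

    // Called by a dependent add-on, e.g. registerAssistant("AlertReview", ...).
    void registerAssistant(String name, Object assistant) {
        assistants.put(name, assistant);
    }

    // Typed lookup: returns null if nothing is registered under that name.
    <T> T getAssistant(String name, Class<T> type) {
        return type.cast(assistants.get(name));
    }
}
```

This keeps the map in one place, but as noted it doesn't by itself solve the langchain4j-annotations dependency on the assistant interfaces.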
🤷‍♂️ I dunno, I'm far from a design guru 🤪