
LLM split functionality into related add-ons

Open psiinon opened this issue 8 months ago • 7 comments

The llm add-on should really just provide the integration with all of the LLMs supported. The actual ZAP integrations should be moved to be sub-extensions of the relevant add-ons, e.g.

  • alertfilter
  • openapi

psiinon avatar Jul 03 '25 14:07 psiinon

Will start looking at splitting the Alert Review functionality into the Alert Filter add-on.

kingthorin avatar Aug 11 '25 13:08 kingthorin

Better to start laying out the foundation to do so.

thc202 avatar Aug 11 '25 14:08 thc202

Could you be more specific?

kingthorin avatar Aug 11 '25 14:08 kingthorin

Actually, I'll look at it. I'm guessing you mean there are generic things that should be exposed via the Extension (class) that'd make life easier for other extensions/add-ons to use or consume.

kingthorin avatar Aug 11 '25 15:08 kingthorin

How's this sound as the initial implementation?

  • Allow to get the options set.
  • Allow to check isConfigured.
  • Allow to get a LlmCommunicationService and set their own LlmAssistant implementation (This part I'm not 100% sure about, but I "think" it makes sense.)
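A minimal sketch of the extension surface proposed above. Only `ExtensionLlm`, `LlmCommunicationService`, and `LlmAssistant` are names from this thread; the method signatures and the placeholder interfaces are illustrative assumptions, not the actual add-on API.

```java
import java.util.Optional;

// Placeholder stand-ins for the real add-on types (hypothetical shapes).
interface LlmCommunicationService {
    String complete(String prompt);
}

interface LlmAssistant {
    String assist(String input);
}

class ExtensionLlm {
    private LlmCommunicationService service;

    // "Allow to check isConfigured": true once an LLM provider is set up.
    boolean isConfigured() {
        return service != null;
    }

    void setCommunicationService(LlmCommunicationService service) {
        this.service = service;
    }

    // Consuming add-ons fetch the service via the extension instead of
    // reading the LLM options directly.
    Optional<LlmCommunicationService> getCommunicationService() {
        return Optional.ofNullable(service);
    }
}
```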

kingthorin avatar Aug 12 '25 10:08 kingthorin

  • Why allow to get the options set?
  • Sounds good if done through the extension.
  • That means we'll have to spill the langchain4j implementation, maybe that's fine. If we are doing that we could just add a factory for the "assistants".
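The "factory for the assistants" idea could keep the langchain4j wiring inside the LLM add-on, so consumers never depend on it directly. A hedged sketch, with all names illustrative and a plain function standing in for the model call:

```java
import java.util.function.Function;

// Hypothetical assistant shape; the real one would carry review methods.
interface LlmAssistant {
    String assist(String input);
}

class AssistantFactory {
    // In the real add-on this would build a langchain4j-backed assistant;
    // here the model is faked with a String -> String function.
    static LlmAssistant create(String systemPrompt, Function<String, String> model) {
        return input -> model.apply(systemPrompt + "\n" + input);
    }
}
```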

thc202 avatar Aug 12 '25 16:08 thc202

  • Why allow to get the options set?

Actually I guess if the other code can get a (or access to) LlmCommunicationService then it doesn't need access to the options.

  • That means we'll have to spill the langchain4j implementation, maybe that's fine. If we are doing that we could just add a factory for the "assistants".

I'm open to ideas/suggestions.

I was just trying to picture moving all the Alert Review code out of the current add-on and into Alert Filter. It seemed/seems like:

  • Most of the bits can be taken out of LlmCommunicationService and put into the Menu class or an Action class (or similar).
  • It needs to be able to communicate and in order to communicate an Assistant needs to be established. (Assuming we keep the current mode of operation....)
    • I guess another option would be to have the LLM add-on keep a registration map of assistants ... like Map.of("AlertReview", AlertReviewAssistant) ... with AlertReviewAssistant being an LlmAssistant implementer (with the existing review methods and associated code).... Then Alert Filter could get ExtensionLlm and registerAssistant. I guess that still means it needs langchain4j for the annotations.....hrm
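The registration-map idea above could be sketched like this. `ExtensionLlm` and `LlmAssistant` come from the thread; the key-based map, `registerAssistant` signature, and lookup method are assumptions for illustration.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Placeholder for the real assistant interface (hypothetical shape).
interface LlmAssistant {
    String assist(String input);
}

class ExtensionLlm {
    // The LLM add-on owns the map; consumers register under a key,
    // e.g. the Alert Filter add-on registering under "AlertReview".
    private final Map<String, LlmAssistant> assistants = new ConcurrentHashMap<>();

    void registerAssistant(String key, LlmAssistant assistant) {
        assistants.put(key, assistant);
    }

    Optional<LlmAssistant> getAssistant(String key) {
        return Optional.ofNullable(assistants.get(key));
    }
}
```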

🤷‍♂️ I dunno, I'm far from a design guru 🤪

kingthorin avatar Aug 12 '25 16:08 kingthorin