
Configurable `Generator` parameters

Open erickgalinkin opened this issue 1 year ago • 10 comments

We need a way to instantiate generators that:

  • Accepts a device_map for multi-GPU systems
  • Accepts a dtype (so we can load torch.bfloat16, for example)
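
For reference, roughly what this needs to end up doing under the hood (a sketch using the standard transformers API; these parameter names are transformers', not settled garak options):

```python
import torch
from transformers import AutoModelForCausalLM

# sketch only: transformers' own parameters, not garak config keys
model = AutoModelForCausalLM.from_pretrained(
    "gpt2",
    device_map="auto",           # let accelerate spread layers across available GPUs
    torch_dtype=torch.bfloat16,  # load weights in bfloat16
)
```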

erickgalinkin avatar Apr 09 '24 15:04 erickgalinkin

As a note, this kind of generator customization feels like something that should be standardized as config options that can be passed via the -G option, similar to how REST options are passed.
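
Hypothetically, something like this (the keys here are made up for illustration; the mechanism mirrors the JSON options file the REST generator already takes):

```json
{
  "device_map": "auto",
  "torch_dtype": "bfloat16"
}
```

invoked as, say, `garak --model_type huggingface --model_name gpt2 -G hf_options.json` (filename arbitrary).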

jmartin-tech avatar Apr 09 '24 20:04 jmartin-tech

Link #554

jmartin-tech avatar Apr 09 '24 20:04 jmartin-tech

trying to get this running via the standard config. the current config machinery only auto-populates plugin values at module level, not at class level, which makes this a pain. will get that working better in the plugin loader

leondz avatar Apr 10 '24 08:04 leondz

Oh cool (narrator voice: it was not, in fact, cool): generators aren't loaded with _plugins.load_plugin, or any of the usual plugin loading machinery, so their config never gets automatically set up. I did wonder how the plugin loader would handle generator instantiation params (just model_name).

Actions:

  • move generator loading to use the plugin loading & configuration architecture - find a way to pass model_name at instantiation time (maybe a special case in load_plugin, eh) - tracked in #594
  • update the configuration machinery to take class-level info as it looks in _plugins / _config. this is nested dataclasses, then nested dicts - not completely palatable, but functional. prefer keeping the YAML format looking sensible - tracked in #595
  • will set up a Hugging Face mixin that loads huggingface module-level config into the HF classes, so we can carry on working while those two issues are addressed - rough sketch below
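
A minimal sketch of what that mixin might look like (the class name, method name, and the _config access pattern are all illustrative assumptions, not the final API):

```python
from garak import _config


class HFConfigMixin:
    """Illustrative only, not the shipped mixin: pull huggingface
    module-level options out of the central config and apply them
    to the generator instance."""

    def _load_hf_config(self):
        # assumed config shape: a dict of option name -> value living at
        # _config.plugins.generators["huggingface"]; both the key and
        # the shape are guesses at this point
        hf_opts = getattr(_config.plugins, "generators", {}).get("huggingface", {})
        for key, value in hf_opts.items():
            setattr(self, key, value)
```

An HF generator class would then mix this in and call `self._load_hf_config()` during `__init__`, before the model is loaded.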

leondz avatar Apr 10 '24 09:04 leondz

When we work out device_map - and this feels global rather than Hugging Face-related, is that right? - we should:

  • [ ] look for mentions of cuda (or maybe just cuda:) in the code and test that they're not overriding it
  • [ ] write a test requesting cpu, the default cuda device, and a specific non-first cuda device (skipping if unavailable), and verify that each is actually used - a sketch follows this list
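
A rough sketch of that second test (`make_generator` and the `.device` attribute are hypothetical stand-ins for whatever instantiation path and attribute the config work settles on):

```python
import pytest
import torch


@pytest.mark.parametrize(
    "requested",
    [
        "cpu",
        pytest.param(
            "cuda",
            marks=pytest.mark.skipif(
                not torch.cuda.is_available(), reason="no CUDA device"
            ),
        ),
        pytest.param(
            "cuda:1",
            marks=pytest.mark.skipif(
                torch.cuda.device_count() < 2, reason="needs a second GPU"
            ),
        ),
    ],
)
def test_requested_device_is_used(requested):
    # make_generator / .device are hypothetical: substitute the real
    # instantiation path and device attribute once they exist
    gen = make_generator(device=requested)
    assert str(gen.device).startswith(requested)
```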

leondz avatar Apr 11 '24 20:04 leondz

seems like device_map is an HF-dependent concept, so _config.plugins.generators.huggingface makes more sense as a home
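
i.e., hypothetically, something like this in the run YAML (the device_map key under that block is an assumed shape, not a released option):

```yaml
plugins:
  generators:
    huggingface:
      device_map: auto  # assumed key, passed through to transformers
```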

leondz avatar Apr 22 '24 13:04 leondz

I think, from a generic point of view, we could have the Configurable functionality in generators/base.py as a dict and validate the parameters in the class before consuming the values. Each generator can then determine how to handle the dict - in some cases by providing a set of expected valid keys, or by just validating that the keys and values are all in a format that doesn't present an easy entry point for RCE via the generator configuration pattern.
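
A minimal sketch of that idea (class and attribute names here are placeholders, not a settled Configurable API):

```python
class Configurable:
    """Illustrative base: each generator subclass declares the
    config keys it accepts and gets basic value validation."""

    # subclasses override with their accepted option names
    _supported_config_keys: set = set()

    def _apply_config(self, options: dict):
        for key, value in options.items():
            if key not in self._supported_config_keys:
                raise ValueError(f"unsupported config key: {key!r}")
            # only allow plain scalar values, to keep the config from
            # becoming an easy code-injection entry point
            if not isinstance(value, (str, int, float, bool)):
                raise ValueError(f"unsupported value type for {key!r}")
            setattr(self, key, value)
```

A generator would set `_supported_config_keys = {"device_map", "torch_dtype"}` (or similar) and call `self._apply_config(options)` at instantiation time.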

jmartin-tech avatar Apr 22 '24 14:04 jmartin-tech

folding #594 into here, check out notes there

leondz avatar May 02 '24 11:05 leondz

folding #643 into here, check out notes there

leondz avatar May 02 '24 11:05 leondz

There's an example of generators loading centralised configs in RestGenerator; see https://github.com/leondz/garak/blob/075796f044c2877977383dcf561b5a54f865c009/garak/generators/rest.py#L113-L149

leondz avatar May 02 '24 11:05 leondz

Does this mean we won't be able to run garak with any model that doesn't fit on a single GPU until the multi-GPU feature from this pull request is merged to main and released? @leondz

kbmlcoding avatar Jun 05 '24 13:06 kbmlcoding

You can hardcode it for now, but #711 is expected to land this week, which paves the way to exposing this.

leondz avatar Jun 05 '24 13:06 leondz

Awesome, that is great @leondz. Where can I find the instructions to build garak locally on either a Mac or Linux system if I were to make the hardcoded change you suggested?

Assuming all I need to do is change https://github.com/leondz/garak/blob/7839ee33c5b2f6568204205086aff183653cc4f7/garak/generators/huggingface.py#L476 to
`self.init_device = "auto"`

kbmlcoding avatar Jun 05 '24 14:06 kbmlcoding