Fix huggingface inference endpoint name
Fixes #998
The name provided during construction is populated by the call to `super().__init__()`; access via `self` attributes is required for any value populated by Configurable.
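For illustration, a minimal sketch of the pattern described above, using toy class names rather than garak's actual Configurable or generator code; it only shows why the URI must be built from `self.name` after `super().__init__()` returns, not from the local parameter.

```python
class ToyConfigurable:
    """Stands in for Configurable: it may populate `name` from config."""

    def __init__(self, name="", config_root=None):
        config = config_root or {}
        # constructor value wins; config only fills in what was not supplied
        self.name = name or config.get("name", "")


class ToyInferenceEndpoint(ToyConfigurable):
    def __init__(self, name="", config_root=None):
        super().__init__(name, config_root=config_root)
        # after super().__init__() the resolved value lives on self.name;
        # the local `name` parameter may still be empty if it came from config
        self.uri = self.name


gen = ToyInferenceEndpoint(config_root={"name": "https://example-endpoint.invalid"})
assert gen.uri == "https://example-endpoint.invalid"
```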
Verification
List the steps needed to make sure this thing works
- [ ] Execute against the public InferenceAPI:
  `python -m garak --model_type huggingface.InferenceAPI --model_name microsoft/Phi-3-mini-4k-instruct --probes malwaregen.Evasion`
- [ ] Verify valid responses are received from the endpoint.
- [ ] Verify added automation tests pass
I tested the public endpoint referenced in the issue; I have not done an end-to-end test with a private endpoint.
Tested the fix and it works for what we've been trying out. Also, `python3 -m garak --model_type huggingface.InferenceEndpoint --model_name https://api--inference.huggingface.co/models/microsoft/Phi-3-mini-4k-instruct --probes malwaregen.Evasion` seems to send queries to the right place, so I would assume private ones would work too.
I guess (and it seemed to work, tho it's getting late)

def __init__(self, name="", config_root=_config):
    super().__init__(name, config_root=config_root)
    self.uri = ... + self.name

in both would make it a bit easier to follow; the `self.name = name` confused me when trying to figure the code out, as I thought it implied the super() won't do anything to it :-)
@ppietikainen thanks for the extra testing. We have tried to add documentation on how config values for Configurable classes are expected to be prioritized. The assignment of `name` as provided to the constructor relates to how the precedence of values is treated, allowing constructor values to be held above configuration file values that are injected by `super()`.
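Roughly, the precedence rule being described is something like this (a toy sketch, not garak's actual resolution logic):

```python
def resolve(constructor_value, config_value):
    # constructor values are held above configuration file values
    return constructor_value if constructor_value else config_value


assert resolve("microsoft/Phi-3-mini-4k-instruct", "name-from-config") == "microsoft/Phi-3-mini-4k-instruct"
assert resolve("", "name-from-config") == "name-from-config"
```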
Also, `name` has competing usage across the different classes. For InferenceAPI it is expected to be just the model name and gets appended to the public base URI defined in a constant, while for InferenceEndpoint the full URI must be supplied. I do think there may be some consistency to be gained at some point in the future, but I don't have a chosen path forward in mind, so for now this can land to at least restore the functionality to its state before the regression.
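To make the competing usage concrete, a hedged sketch of the two behaviours; the class names and the base URI constant here are illustrative stand-ins, not the exact garak source:

```python
PUBLIC_BASE_URI = "https://api-inference.huggingface.co/models/"  # illustrative constant


class SketchInferenceAPI:
    """`name` is only the model id; it is appended to the public base URI."""

    def __init__(self, name=""):
        self.name = name
        self.uri = PUBLIC_BASE_URI + self.name


class SketchInferenceEndpoint:
    """`name` must already be the full endpoint URI."""

    def __init__(self, name=""):
        self.name = name
        self.uri = self.name


print(SketchInferenceAPI("microsoft/Phi-3-mini-4k-instruct").uri)
print(SketchInferenceEndpoint("https://example-private-endpoint.invalid/models/phi-3").uri)
```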