private-transformers
Feature Request: Context Manager for training mixed private and public data
Hi,
I am training on multiple datasets, some private and some public, and I want to apply DP only to the private samples. I could pass an indicator with each batch marking whether it is private or public. I would like to be able to write:
```python
if private_data:
    with private_context(enabled=True):
        training_step(batch)
else:
    training_step(batch)
```
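Roughly, I imagine the context manager temporarily flipping an "enabled" flag on the privacy engine and restoring it on exit. A minimal self-contained sketch of that idea (the names `PrivacyEngineStub` and `private_context` here are hypothetical, not part of the library's API):

```python
from contextlib import contextmanager


class PrivacyEngineStub:
    """Hypothetical stand-in for a privacy engine with an on/off switch.

    In a real integration, enabled=False would mean the attached optimizer
    skips per-sample gradient clipping and noise addition for that step.
    """

    def __init__(self):
        self.enabled = True


@contextmanager
def private_context(engine, enabled=True):
    # Temporarily toggle DP processing; restore the previous state on exit,
    # even if the training step raises.
    previous = engine.enabled
    engine.enabled = enabled
    try:
        yield engine
    finally:
        engine.enabled = previous


engine = PrivacyEngineStub()

with private_context(engine, enabled=False):
    pass  # engine.enabled is False here, so a public batch could skip DP
# engine.enabled is True again after the block
```

This would keep a single optimizer, so no state-dict juggling is needed when alternating between private and public batches.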
The only workaround I can think of is to use two different optimizers, one with the privacy engine attached and one without, and to load the state dict each time I switch so the optimizer state stays consistent.
How easy would it be to bake in a context manager?