Bug/sg 861 decouple qat from train from config
This PR decouples QAT/PTQ from train_from_config. The goal is to let users launch PTQ/QAT from Python instead of the CLI + config files, while relying on as many defaults as possible. Note that automatically adapting the parameters to quantization best practices is the user's responsibility when launching this way, since the new "quantize" method accepts ready-made objects rather than config parameters.
- Renamed train_from_config in QATTrainer to quantize_from_config.
- Introduced a "quantize" method.
- Added the option to pass as few parameters as possible, so users can reuse existing objects that are already set up in train() (see the usage sketch after this list).
- Added a simple test to our recipe test suite that takes a model from train to QAT (modified the CI config to install pytorch-quantization beforehand; this is possible since these tests run on GPU spot instances).
- Added a lot of documentation, as there was barely any before.
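
For illustration, here is a minimal sketch of how the new Python entry point could be called, reusing objects that a regular train() run would already have. The QATTrainer import path, constructor arguments, and the quantize parameter names (model, calib_dataloader, val_dataloader) are assumptions made for this example rather than the confirmed signature, and the model/dataloader names are placeholders; see the added docs for the real API.

```python
# Hypothetical usage sketch -- import path, argument names, and constructor
# arguments below are assumptions for illustration, not the confirmed API.
from super_gradients.training import models, dataloaders
from super_gradients.training.qat_trainer.qat_trainer import QATTrainer  # assumed path

# Reuse objects that a normal train() run would already have built.
model = models.get("resnet18", num_classes=10)      # placeholder model/dataset
calib_loader = dataloaders.get("cifar10_train")     # placeholder dataloader names
val_loader = dataloaders.get("cifar10_val")

qat_trainer = QATTrainer(experiment_name="resnet18_qat")  # assumed ctor args

# PTQ/QAT launched directly from Python: no CLI, no recipe config.
# Adapting hyper-parameters to quantization best practices is left to the caller
# when using this path.
qat_trainer.quantize(
    model=model,                    # assumed parameter name
    calib_dataloader=calib_loader,  # assumed parameter name
    val_dataloader=val_loader,      # assumed parameter name
)
```

The point of the design is that everything passed in is an already-constructed object, so no recipe config or CLI invocation is involved.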