optimum
Update: fix typo `from_pretained` to `from_pretrained`
What does this PR do?
Fix typo `from_pretained` to `from_pretrained`
Fixes #1088 (https://github.com/huggingface/optimum/issues/1088)
Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you make sure to update the documentation with your changes?
- [x] Did you write any new necessary tests?
Please check! @michaelbenayoun @jin
"Maybe this looks small, but it can make a big wave...!"
@YooSungHyun Let's talk about the issues here instead of in #1088.
About the `quantization` attribute not being a `QuantizationConfig` but a dictionary: you are right, and this is bad because it is actually documented as a `QuantizationConfig`.
You can also fix this here, by doing the following in `optimum/onnxruntime/configuration.py`:
```python
def __init__(
    self,
    opset: Optional[int] = None,
    use_external_data_format: bool = False,
    one_external_file: bool = True,
    optimization: Optional[OptimizationConfig] = None,
    quantization: Optional[QuantizationConfig] = None,
    **kwargs,
) -> None:
    super().__init__()
    self.opset = opset
    self.use_external_data_format = use_external_data_format
    self.one_external_file = one_external_file
    self.optimization = optimization
    self.quantization = quantization
    self.optimum_version = kwargs.pop("optimum_version", None)
```
Basically the only difference here is that we do not call the `dataclass_to_dict` method.
Then, to be able to serialize `ORTConfig`, you will need to override the `to_dict` method:
```python
def to_dict(self) -> Dict[str, Any]:
    clone = copy.deepcopy(self)
    clone.optimization = self.dataclass_to_dict(clone.optimization)
    clone.quantization = self.dataclass_to_dict(clone.quantization)
    return BaseConfig.to_dict(clone)
```
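For illustration, a minimal round-trip sketch of how this would be used (assuming the two changes above are applied; the `avx512` preset and directory name are just placeholders):

```python
from optimum.onnxruntime import ORTConfig
from optimum.onnxruntime.configuration import AutoQuantizationConfig

# Build a typed quantization config (the avx512 preset is only an example).
qconfig = AutoQuantizationConfig.avx512(is_static=False, per_channel=False)
ort_config = ORTConfig(quantization=qconfig)

# With the modified __init__, the attribute stays a dataclass instead of a dict.
assert not isinstance(ort_config.quantization, dict)

# save_pretrained relies on to_dict(), which is why the override above is needed
# to keep ORTConfig JSON-serializable.
ort_config.save_pretrained("ort_config_dir")
```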
@michaelbenayoun You mean: convert to OptimizationConfig and QuantizationConfig in __init__(), and when I need something like save_pretrained(), go through ORTConfig().to_dict() -> save_pretrained()?
I just applied your suggested code and ran it; quantization is still a dict, not a QuantizationConfig dataclass.
No, I mean:
- Not transforming the `optimization` and `quantization` attributes to dictionaries in `ORTConfig.__init__`.
- Doing that will make the `ORTConfig` not JSON-serializable; to make it serializable, just add the `to_dict` method I shared to the `ORTConfig` class.
First of all, I'll commit first so we can discuss over the code..!
Please write a review.
@michaelbenayoun First, I don't speak English very well, so I may not understand everything, but I really want to contribute this code..! So please bear with me.
I will change the source code around ORTConfig.from_pretrained() -> ORTConfig.from_dict().
I will override PretrainedConfig.from_dict() in ORTConfig, so just calling ORTConfig.from_pretrained() will give optimization and quantization attributes that are an OptimizationConfig and a QuantizationConfig respectively.
And in this case I think to_dict() may still be needed, so I just left it in place.
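As a rough sketch of the idea (only an illustration of the override described above, not the exact code from the PR; enum-valued fields may need extra handling in practice):

```python
@classmethod
def from_dict(cls, config_dict: Dict[str, Any], **kwargs) -> "ORTConfig":
    # Work on a copy so the caller's dict is not mutated.
    config_dict = dict(config_dict)
    # Rebuild the nested dataclasses so that from_pretrained() returns
    # OptimizationConfig / QuantizationConfig instances instead of plain dicts.
    optimization = config_dict.pop("optimization", None)
    quantization = config_dict.pop("quantization", None)
    if isinstance(optimization, dict):
        optimization = OptimizationConfig(**optimization)
    if isinstance(quantization, dict):
        quantization = QuantizationConfig(**quantization)
    return cls(optimization=optimization, quantization=quantization, **config_dict, **kwargs)
```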
I'm working on it....
@michaelbenayoun Ta-da! I've changed all my code. I think the previous version of the source code needed two steps:
- load the whole ORT config
- build the QuantizationConfig

With that approach, if someone needs an OptimizationConfig, they have to write the OptimizationConfig loading code themselves, whereas QuantizationConfig does not have this problem.
But look at the new version...
By just overriding from_dict for BaseConfig.from_pretrained(), it can load all the config classes (OptimizationConfig and QuantizationConfig) at once.
Also, by using getattr, we can load exactly the right type (and if a value is missing, the default value is used).
If someone needs an OptimizationConfig, they can just write their code under `if config.optimization:`, since everything is already loaded.
Finally, quantize.py becomes more scalable, because an ORTConfig is simply used as an ORTConfig, with no additional changes needed (even if additional config values are added); see the sketch below.
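As a rough illustration of that point (the helper names below are hypothetical, not the actual quantize.py code):

```python
# Hypothetical consumer code: with a typed ORTConfig, the nested configs can be
# passed along directly, with no dict-to-dataclass conversion at the call site.
ort_config = ORTConfig.from_pretrained("path/to/ort_config_dir")

if ort_config.quantization:
    apply_quantization(quantization_config=ort_config.quantization)  # hypothetical helper

if ort_config.optimization:
    apply_optimization(optimization_config=ort_config.optimization)  # hypothetical helper
```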
I deleted super().__init__() because it creates additional parameters that are not used.
And I think pruned_heads and label-related attributes are not used by ORTConfig, so I deleted those too.
This approach uses overriding, so I think there may be side-effect problems; a code review is necessary.
I tested it and it works well; when a config JSON key is missing, it sets the default value... how about it?
I think it's good. What do you think?
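A minimal check of that missing-key behavior might look like this (assuming a from_dict override along the lines sketched earlier; the values are just placeholders):

```python
# A config dict missing the "quantization" / "optimization" keys still loads,
# with those attributes falling back to their defaults (None).
partial_config = {"opset": 13}
cfg = ORTConfig.from_dict(partial_config)
assert cfg.quantization is None and cfg.optimization is None
```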
I think all the code modifications are done.
@michaelbenayoun
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
@fxmarty @michaelbenayoun
Thank you @YooSungHyun and apologies for the delay, I'll give a review on Monday!
Can you run `make style`? The check_code_quality workflow does not pass currently.
@fxmarty @michaelbenayoun Please check...!