optimum-intel
Infer if the model needs to be exported
This PR (https://github.com/huggingface/optimum-intel/pull/722/) removes the need to specify whether the model should be exported with the `export` argument:
```diff
  from optimum.intel import OVModelForCausalLM

- model = OVModelForCausalLM.from_pretrained("gpt2", export=True)
+ model = OVModelForCausalLM.from_pretrained("gpt2")
```