
Support longformer in ORTModel

Open fxmarty opened this issue 2 years ago • 1 comment

Feature request

Longformer takes global_attention_mask as an extra input in the current transformers ONNX export. Hence, it is currently not supported by ORTModel.
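For context, a minimal sketch of how one can run the exported model today with a raw onnxruntime session, passing the extra global_attention_mask by hand. The file path and the input names are assumptions about what the transformers export produces, not verified values:

```python
# Sketch: raw onnxruntime inference with an exported Longformer model.
# Assumes the export has inputs named "input_ids", "attention_mask" and
# "global_attention_mask" (names are an assumption, check the exported graph).
import numpy as np
import onnxruntime
from transformers import AutoTokenizer

model_id = "allenai/longformer-base-4096"
onnx_path = "longformer.onnx"  # placeholder path to the exported model

tokenizer = AutoTokenizer.from_pretrained(model_id)
session = onnxruntime.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])

inputs = tokenizer("Hello world", return_tensors="np")

# Longformer needs the extra global_attention_mask input; here only the
# first (CLS) token gets global attention, which is the usual default.
global_attention_mask = np.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = session.run(
    None,
    {
        "input_ids": inputs["input_ids"],
        "attention_mask": inputs["attention_mask"],
        "global_attention_mask": global_attention_mask,
    },
)
```

This extra input is exactly what ORTModel's generic forward pass does not handle today.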

Motivation

Before going forward, it would be good to benchmark Longformer using the transformers ONNX export against the custom implementation in https://github.com/microsoft/onnxruntime/tree/main/onnxruntime/python/tools/transformers/models/longformer
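A rough latency-comparison sketch along those lines, timing two already-exported models with plain onnxruntime. The file paths, input names, batch size and sequence length are all placeholders; the custom onnxruntime export may expect different input names:

```python
# Sketch: compare average per-run latency of two Longformer ONNX exports.
import time
import numpy as np
import onnxruntime

def benchmark(onnx_path, feed, n_runs=50):
    session = onnxruntime.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    session.run(None, feed)  # warm-up run before timing
    start = time.perf_counter()
    for _ in range(n_runs):
        session.run(None, feed)
    return (time.perf_counter() - start) / n_runs

batch, seq_len = 1, 1024
feed = {
    "input_ids": np.random.randint(0, 1000, (batch, seq_len), dtype=np.int64),
    "attention_mask": np.ones((batch, seq_len), dtype=np.int64),
    "global_attention_mask": np.zeros((batch, seq_len), dtype=np.int64),
}
feed["global_attention_mask"][:, 0] = 1  # global attention on the CLS token

for path in ["longformer_transformers.onnx", "longformer_ort_tool.onnx"]:
    print(path, f"{benchmark(path, feed) * 1000:.1f} ms / run")
```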

Your contribution

I can look into how we could support this, but I'm worried that adding custom cases like this to ORTModel adds overhead.

fxmarty avatar Nov 17 '22 10:11 fxmarty

As an alternative, is it possible/reasonable to use another model's config to optimize Longformer? For example, putting

 {
  "model_type": "bert"
}

in config.json to optimize Longformer? Thanks @fxmarty

AdriandLiu avatar Jan 18 '24 20:01 AdriandLiu
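For reference, a minimal sketch of what such a config override roughly corresponds to at the onnxruntime level: running the BERT fusion passes on the Longformer graph by forcing model_type="bert". The path and the num_heads/hidden_size values are placeholders for longformer-base; whether the BERT fusions actually apply cleanly to Longformer's attention pattern is precisely the open question:

```python
# Sketch: force BERT-style graph fusions on a Longformer ONNX export.
from onnxruntime.transformers.optimizer import optimize_model

optimized = optimize_model(
    "longformer.onnx",   # placeholder path to the exported model
    model_type="bert",   # treat the graph as BERT-like for fusion
    num_heads=12,        # values for longformer-base (assumption)
    hidden_size=768,
)
optimized.save_model_to_file("longformer_optimized.onnx")
```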