
Is there a plan for the BigDL/PPML projects to support running XLM-RoBERTa large-XNLI within a TEE?

Open · antchainmappic opened this issue 1 year ago · 1 comment

antchainmappic · Sep 12 '23 03:09

XLM-RoBERTa large-XNLI can be loaded with the transformers API, as shown in https://huggingface.co/joeddav/xlm-roberta-large-xnli#with-manual-pytorch. You can give it a quick try with the transformers API in bigdl-llm: simply change the import and set load_in_4bit=True when loading the model, as shown below:

# import the AutoXXX class from bigdl.llm.transformers instead of transformers,
# and set load_in_4bit=True in from_pretrained to load the model in 4-bit
from bigdl.llm.transformers import AutoModelForSequenceClassification
nli_model = AutoModelForSequenceClassification.from_pretrained('joeddav/xlm-roberta-large-xnli', load_in_4bit=True)

# import AutoTokenizer from transformers as usual
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('joeddav/xlm-roberta-large-xnli')

# the rest of the code remains the same as in the model card example
# ...
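
For reference, here is a minimal sketch of the zero-shot NLI step that the "# ..." above stands for, following the manual-PyTorch example on the Hugging Face model card linked earlier; the premise text and candidate label below are only illustrative placeholders:

# example premise and candidate label (illustrative values only)
premise = "Angela Merkel is a politician in Germany and leader of the CDU"
hypothesis = "This example is politics."

# encode the premise/hypothesis pair and run it through the NLI model
inputs = tokenizer(premise, hypothesis, return_tensors='pt', truncation=True)
logits = nli_model(**inputs).logits

# keep only the "contradiction" (0) and "entailment" (2) logits, and take the
# softmax probability of "entailment" as the probability that the label is true
entail_contradiction_logits = logits[:, [0, 2]]
probs = entail_contradiction_logits.softmax(dim=1)
prob_label_is_true = probs[:, 1]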

shane-huang · Sep 13 '23 01:09