
Converting Hugging Face model to CoreML: Torch var attention_mask.1 not found in context

Open · sidyakinian opened this issue 3 years ago

Issue

I'm trying to convert a Hugging Face pre-trained transformer: Hugging Face model -> TorchScript -> Core ML. I'm following the Hugging Face documentation to export the transformer as a TorchScript .pt model, but I then get an error when converting the TorchScript model to Core ML as described in the coremltools documentation.

Code

Most of it is the Hugging Face example code for exporting a model to TorchScript.

# Hugging Face documentation

from transformers import BertModel, BertTokenizer, BertConfig
import torch

enc = BertTokenizer.from_pretrained("bert-base-uncased")

# Tokenizing input text
text = "[CLS] Who was Jim Henson ? [SEP] Jim Henson was a puppeteer [SEP]"
tokenized_text = enc.tokenize(text)

# Masking one of the input tokens
masked_index = 8
tokenized_text[masked_index] = "[MASK]"
indexed_tokens = enc.convert_tokens_to_ids(tokenized_text)
segments_ids = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1]

# Creating a dummy input
tokens_tensor = torch.tensor([indexed_tokens])
segments_tensors = torch.tensor([segments_ids])
dummy_input = [tokens_tensor, segments_tensors]

# Initializing the model with the torchscript flag
# Flag set to True even though it is not necessary as this model does not have an LM Head.
config = BertConfig(
    vocab_size_or_config_json_file=32000,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    intermediate_size=3072,
    torchscript=True,
)

# Instantiating the model
model = BertModel(config)

# Set model to evaluation mode
model.eval()

# If you are instantiating the model with *from_pretrained* you can also easily set the TorchScript flag
model = BertModel.from_pretrained("bert-base-uncased", torchscript=True)

# Creating the trace
traced_model = torch.jit.trace(model, dummy_input)
torch.jit.save(traced_model, "traced_bert.pt")

# End of HF documentation
# Converting to CoreML
import coremltools as ct
mlmodel = ct.convert("traced_bert.pt",
                    inputs=[ct.TensorType(shape=(2, 1, 14))]) # This produces an error

Output of last two lines

WARNING:root:Tuple detected at graph output. This will be flattened in the converted model.
Converting Frontend ==> MIL Ops:   3%|▏    | 20/651 [00:00<00:00, 4774.66 ops/s]

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/var/folders/8l/zrc6s_ln7xxch0gn0f06lz680000gn/T/ipykernel_36483/67818705.py in <module>
----> 1 mlmodel = ct.convert("traced_bert.pt",
      2                     inputs=[ct.TensorType(shape=(2, 1, 14))])

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/_converters_entry.py in convert(model, source, inputs, outputs, classifier_config, minimum_deployment_target, convert_to, compute_precision, skip_model_load, compute_units, useCPUOnly, package_dir, debug)
    350             raise Exception("If package_dir is provided, it must have extension {} (not {})".format(_MLPACKAGE_EXTENSION, ext))
    351 
--> 352     mlmodel = mil_convert(
    353         model,
    354         convert_from=exact_source,

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/converter.py in mil_convert(model, convert_from, convert_to, compute_units, **kwargs)
    181         See `coremltools.converters.convert`
    182     """
--> 183     return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
    184 
    185 

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/converter.py in _mil_convert(model, convert_from, convert_to, registry, modelClass, compute_units, **kwargs)
    208         _os.chmod(weights_dir, _stat.S_IRWXU | _stat.S_IRWXG | _stat.S_IRWXO)
    209 
--> 210     proto, mil_program = mil_convert_to_proto(
    211                             model,
    212                             convert_from,

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/converter.py in mil_convert_to_proto(model, convert_from, convert_to, converter_registry, **kwargs)
    271     frontend_converter = frontend_converter_type()
    272 
--> 273     prog = frontend_converter(model, **kwargs)
    274 
    275     if convert_to.lower() != "neuralnetwork":

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/converter.py in __call__(self, *args, **kwargs)
    103         from .frontend.torch import load
    104 
--> 105         return load(*args, **kwargs)
    106 
    107 

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/load.py in load(model_spec, debug, **kwargs)
     45     cut_at_symbols = kwargs.get("cut_at_symbols", None)
     46     converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
---> 47     return _perform_torch_convert(converter, debug)
     48 
     49 

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/load.py in _perform_torch_convert(converter, debug)
     82 def _perform_torch_convert(converter, debug):
     83     try:
---> 84         prog = converter.convert()
     85     except RuntimeError as e:
     86         if debug and "convert function" in str(e):

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/converter.py in convert(self)
    248 
    249             # Add the rest of the operations
--> 250             convert_nodes(self.context, self.graph)
    251 
    252             graph_outputs = [self.context[name] for name in self.graph.outputs]

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/ops.py in convert_nodes(context, graph)
     87                 "PyTorch convert function for op '{}' not implemented.".format(node.kind)
     88             )
---> 89         add_op(context, node)
     90 
     91         # We've generated all the outputs the graph needs, terminate conversion.

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/ops.py in _slice(context, node)
   3195 @register_torch_op(torch_alias=["slice"])
   3196 def _slice(context, node):
-> 3197     inputs = _get_inputs(context, node, expected=5)
   3198     x = inputs[0]
   3199     dim = inputs[1].val

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/ops.py in _get_inputs(context, node, expected, min_expected)
    185     value of @expected.
    186     """
--> 187     inputs = [context[name] for name in node.inputs]
    188     if expected is not None:
    189         expected = [expected] if not isinstance(expected, (list, tuple)) else expected

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/ops.py in <listcomp>(.0)
    185     value of @expected.
    186     """
--> 187     inputs = [context[name] for name in node.inputs]
    188     if expected is not None:
    189         expected = [expected] if not isinstance(expected, (list, tuple)) else expected

/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/coremltools/converters/mil/frontend/torch/converter.py in __getitem__(self, torch_name)
     77             if torch_name in current_graph:
     78                 return self._current_graph[idx][torch_name]
---> 79         raise ValueError(
     80             "Torch var {} not found in context {}".format(torch_name, self.name)
     81         )

ValueError: Torch var attention_mask.1 not found in context 

Versions

python==3.9.13
torch==1.12.0
transformers==4.16.2
coremltools==5.2.0

sidyakinian avatar Jul 02 '22 19:07 sidyakinian

Our support for TorchScript is experimental. If you can trace your model, please do that.

Is this still an issue if you use our latest beta release? You can install the beta by running: pip install coremltools --pre -U

TobyRoseman avatar Jul 05 '22 19:07 TobyRoseman

Since we have not heard back here, I'm going to close this issue.

TobyRoseman avatar Nov 01 '22 16:11 TobyRoseman

I'm encountering the same issue, using the latest coremltools.

Janus289 avatar Jan 30 '23 21:01 Janus289

Resolved: the traced model takes two inputs, so I needed to pass two entries in the `inputs` list (one per model input) instead of a single stacked shape.

Janus289 avatar Feb 01 '23 14:02 Janus289

@Janus289 Hi, I'm facing the same issue. Could you please show how you resolved it? An example would help; I'm a little confused about what to pass as the `inputs` argument of `ct.convert()`.

yqrickw20 avatar Feb 06 '23 08:02 yqrickw20

https://developer.apple.com/forums/thread/682408

Give it a try! @yqrickw20

Has anyone been able to do this for a larger language model, such as GLM or BLOOM?

abhishekmamdapure avatar Feb 09 '23 11:02 abhishekmamdapure