
5-1.Transformer raises an error

Jeffyang123 opened this issue 3 years ago

MindSpore 1.8.1: how do I solve this problem?

```python
from mindspore import context

context.set_context(mode=context.GRAPH_MODE)
```

```python
net_with_criterion = WithLossCell(model, criterion)
train_network = nn.TrainOneStepCell(net_with_criterion, optimizer)
train_network.set_train()
```

Training

```python
for epoch in range(20):
    # hidden : [num_layers * num_directions, batch, hidden_size]
    loss = train_network(enc_inputs, dec_inputs, target_batch.view(-1))
    print('Epoch:', '%04d' % (epoch + 1), 'cost =', '{:.6f}'.format(loss.asnumpy()))
```


```
RuntimeError                              Traceback (most recent call last)
/tmp/ipykernel_1841/2954476418.py in <module>
      9 for epoch in range(20):
     10     # hidden : [num_layers * num_directions, batch, hidden_size]
---> 11     loss = train_network(enc_inputs, dec_inputs, target_batch.view(-1))
     12     print('Epoch:', '%04d' % (epoch + 1), 'cost =', '{:.6f}'.format(loss.asnumpy()))

/usr/local/python-3.7.5/lib/python3.7/site-packages/mindspore/nn/cell.py in __call__(self, *args, **kwargs)
    576                 logger.warning(f"For 'Cell', it's not support hook function in graph mode. If you want to use hook "
    577                                f"function, please use context.set_context to set pynative mode.")
--> 578         out = self.compile_and_run(*args)
    579         return out
    580

/usr/local/python-3.7.5/lib/python3.7/site-packages/mindspore/nn/cell.py in compile_and_run(self, *inputs)
    963         """
    964         self._auto_parallel_compile_and_run = True
--> 965         self.compile(*inputs)
    966
    967         new_inputs = []

/usr/local/python-3.7.5/lib/python3.7/site-packages/mindspore/nn/cell.py in compile(self, *inputs)
    936         if self._dynamic_shape_inputs is None or self._dynamic_shape_inputs[0] is None:
    937             _cell_graph_executor.compile(self, *inputs, phase=self.phase, auto_parallel_mode=self._auto_parallel_mode,
--> 938                                          jit_config_dict=self._jit_config_dict)
    939         else:
    940             self._check_compile_dynamic_shape(*inputs)

/usr/local/python-3.7.5/lib/python3.7/site-packages/mindspore/common/api.py in compile(self, obj, phase, do_convert, auto_parallel_mode, jit_config_dict, *args)
   1135         if jit_config_dict:
   1136             self._graph_executor.set_jit_config(jit_config_dict)
-> 1137         result = self._graph_executor.compile(obj, args_list, phase, self._use_vm_mode())
   1138         obj.compile_cache.add(phase)
   1139         if not result:

RuntimeError: Parent func graph should be handled in advance, fg: ◀Equal.125, context: {FuncGraph: ◀Equal.125 Args: [0]: AbstractTensor(shape: (1, 5), element: AbstractScalar(Type: Float32, Value: AnyValue, Shape: NoShape), value_ptr: 0x5613f0692bd0, value: AnyValue), Parent: {FuncGraph: ▶Equal.53 Args: [0]: AbstractTensor(shape: (1, 5), element: AbstractScalar(Type: Int32, Value: AnyValue, Shape: NoShape), value_ptr: 0x5613f0692bd0, value: AnyValue), [1]: AbstractTensor(shape: (), element: AbstractScalar(Type: Int32, Value: AnyValue, Shape: NoShape), value_ptr: 0x561472ca8d70, value: Tensor(shape=[], dtype=Int32, value=0)), Parent: { Args: }}}, parent context: {FuncGraph: ▶Equal.53 Args: [0]: AbstractTensor(shape: (1, 5), element: AbstractScalar(Type: Int32, Value: AnyValue, Shape: NoShape), value_ptr: 0x5613f0692bd0, value: AnyValue), [1]: AbstractTensor(shape: (), element: AbstractScalar(Type: Int32, Value: AnyValue, Shape: NoShape), value_ptr: 0x561472ca8d70, value: Tensor(shape=[], dtype=Int32, value=0)), Parent: { Args: }}

- C++ Call Stack: (For framework developers)
mindspore/ccsrc/pipeline/jit/static_analysis/program_specialize.cc:343 FuncGraphSpecializer
```
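The failure happens during graph compilation (the `FuncGraphSpecializer` step above), before any operator runs. A possible workaround while still on 1.8.1, not confirmed in this thread, is to run the same loop in PyNative mode, which executes operators eagerly and skips whole-graph compilation. A minimal sketch, assuming `model`, `criterion`, `optimizer`, and the input tensors are defined as in the original notebook:

```python
from mindspore import context, nn

# Assumption: PYNATIVE_MODE executes ops eagerly, so the graph
# specialization step that raises the RuntimeError is never reached.
context.set_context(mode=context.PYNATIVE_MODE)

net_with_criterion = nn.WithLossCell(model, criterion)
train_network = nn.TrainOneStepCell(net_with_criterion, optimizer)
train_network.set_train()

for epoch in range(20):
    loss = train_network(enc_inputs, dec_inputs, target_batch.view(-1))
    print('Epoch:', '%04d' % (epoch + 1), 'cost =', '{:.6f}'.format(loss.asnumpy()))
```

PyNative mode is slower than graph mode but useful for unblocking training and for debugging, since errors surface at the exact operator call instead of at compile time.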

Jeffyang123 · Sep 24 '22

This is a bug in 1.8.1; it will be fixed in MindSpore 1.9.0 (planned for release on 09/30).

lvyufeng · Sep 24 '22

The problem has been fixed in MindSpore 1.9.0.
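To confirm the upgrade actually took effect before retrying the notebook, a quick check (an illustrative snippet, not from the thread):

```python
import mindspore

# The bug is reported fixed in 1.9.0, so this should print 1.9.0 or later
# (upgrade with: pip install --upgrade mindspore).
print(mindspore.__version__)
mindspore.run_check()  # built-in sanity check that MindSpore is installed correctly
```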

lvyufeng · Nov 02 '22