iRonaldo
```python
bert_encoder = build_transformer_model(
    checkpoint_path=checkpoint_path,
    config_path=config_path,
    return_keras_model=True)
hidden_states_1, _ = bert_encoder([inputs[0], inputs[1]])
hidden_states_2, _ = bert_encoder([inputs[2], inputs[3]])
```

I want to feed two inputs through the same BERT to produce two outputs, but this step keeps failing with:

```
Tensor objects are only iterable when eager execution is enabled. To iterate over...
```
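The error comes from the tuple unpacking, not from reusing the encoder: in graph mode a Keras model call returns a single symbolic `Tensor`, and `a, _ = tensor` tries to iterate it. Below is a minimal sketch with a toy stand-in encoder (the real code would use bert4keras's `build_transformer_model`; the layer names here are illustrative assumptions) showing that calling one model on two inputs already shares weights, and that each call should be assigned to a single variable:

```python
import tensorflow as tf
from tensorflow import keras

# Toy stand-in for the BERT encoder; shapes are arbitrary assumptions.
encoder = keras.Sequential([keras.layers.Dense(8, name="shared_dense")])

in_a = keras.Input(shape=(4,))
in_b = keras.Input(shape=(4,))

# Calling the SAME model object twice reuses its weights. Each call
# returns ONE tensor, so assign it directly -- writing
# `h_a, _ = encoder(in_a)` would iterate a symbolic Tensor and raise
# the "Tensor objects are only iterable..." error in graph mode.
h_a = encoder(in_a)
h_b = encoder(in_b)

model = keras.Model([in_a, in_b], [h_a, h_b])
```

If the bert4keras model really does emit multiple outputs (e.g. hidden states plus pooled output), unpack via the model's `outputs`/indexing on the returned list rather than tuple-unpacking a single tensor.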
### Is your feature request related to a problem? Please describe.

_No response_

### Solutions

I see that GLM-130B provides a FasterTransformer inference path. If ChatGLM shares that model's architecture, could the same conversion tooling be used to speed up its inference?

### Additional context

_No response_