Hi, error log as below:

```
Starting training.
Performing evaluation.
loss
Tensor("transducer/dense_1/BiasAdd:0", shape=(None, None, None, 3971), dtype=float32, device=/job:localhost/replica:0/task:0/device:GPU:0)
Tensor("dist_inputs_4:0", shape=(None, None), dtype=int32)
Tensor("Cast:0", shape=(None,), dtype=int32, device=/job:localhost/replica:0/task:0/device:GPU:0)
Tensor("dist_inputs_3:0", shape=(None,), dtype=int32)
Fatal...
```
Ziya is a large-scale pre-trained model based on LLaMA with 13 billion parameters. It adds Chinese tokens, so the model's vocab size does not match the tokenizer's vocab size. Could anyone give some hints...
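To illustrate the mismatch, here is a minimal sketch assuming the Hugging Face transformers API; the checkpoint name is a placeholder for whatever Ziya weights you load, and resizing the embeddings is only one possible way to reconcile the two sizes, not necessarily the author's intended fix.

```python
# Minimal sketch, assuming Hugging Face transformers; the model path is a placeholder.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "IDEA-CCNL/Ziya-LLaMA-13B-v1"  # placeholder checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

vocab_from_tokenizer = len(tokenizer)                             # tokenizer vocab, incl. added Chinese tokens
vocab_from_model = model.get_input_embeddings().weight.shape[0]  # embedding rows in the checkpoint
print("tokenizer:", vocab_from_tokenizer, "model:", vocab_from_model)

# One common way to reconcile the two: grow/shrink the embedding matrix to the tokenizer size.
if vocab_from_tokenizer != vocab_from_model:
    model.resize_token_embeddings(len(tokenizer))
```

Comparing the two numbers printed above at least confirms whether the mismatch comes from the added Chinese tokens or from loading a tokenizer that does not belong to the checkpoint.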