tang-ed
The bow_loss plateaus at around 2 and won't go any lower,
even though the lm_loss has already dropped to around 0.001.
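For context, the bow_loss reported here is presumably a bag-of-words auxiliary loss of the kind used in latent-variable dialogue models: the average negative log-probability of each target token under a single, order-agnostic distribution. A minimal pure-Python sketch under that assumption (toy vocabulary and logits, all names hypothetical):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def bow_loss(logits, target_ids):
    """Bag-of-words loss: mean negative log-probability of every
    target token under one shared (order-agnostic) distribution."""
    probs = softmax(logits)
    return -sum(math.log(probs[t]) for t in target_ids) / len(target_ids)

# Toy example: vocabulary of 4 tokens, target bag {1, 3}.
loss = bow_loss([0.1, 2.0, 0.3, 1.5], [1, 3])
print(loss)
```

One property worth noting: with a uniform distribution over V tokens the loss equals ln V, so a bow_loss that stalls at a fixed value while lm_loss keeps falling can suggest the auxiliary predictor has stopped improving rather than that training is broken.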
As the library has iterated through versions, some tokenizers are no longer available. My transformers version is 4.11.0.
You can find a dataset yourself; I think the open-source open-domain chat dataset from Tsinghua University is a good choice.
Did you change the code? If so, please post it so I can take a look, because the original code runs fine for me. Note also that the tokenizer is...
That shouldn't happen. My guess is that there is a problem with your environment, since I haven't seen the full error traceback. You could try running it in...
Thanks for using the project. It has since been updated; please try again and see whether the issue persists.
This is what I ran on a paid platform, using ResNet as a case study.
I can run any other PyTorch program normally; only the example provided by Colossal-AI fails.