GLM
HuggingFace module
I read your paper with great interest. You seem to have a lot of novel ideas about how to improve pretraining, and some of the scores are really impressive. I would like to test some of these ideas on other corpora.
Have you considered making the code available as a HuggingFace module (TensorFlow/PyTorch/Flax)? I think this would lead to a lot more people looking into your ideas.
@peregilk Did you try it?
Any update on this? I notice that there is an out-of-the-box pretrained version of GLM-10B. I would like to know whether there are any future plans to upload other pretrained models (e.g. GLM-10B-Chinese)?
@siriusctrl We uploaded GLM-10B-Chinese and GLM-Large-Chinese. You're welcome to try them: https://huggingface.co/BAAI/glm-10b-chinese https://huggingface.co/BAAI/glm-large-chinese
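For anyone landing here later, a minimal sketch of loading these checkpoints with the `transformers` library. This assumes the standard Hub loading pattern; GLM ships its architecture as custom code on the Hub, so `trust_remote_code=True` is needed. Note the weights are a large download.

```python
MODEL_ID = "BAAI/glm-large-chinese"  # or "BAAI/glm-10b-chinese"


def load_glm(model_id: str = MODEL_ID):
    """Fetch the GLM tokenizer and model from the HuggingFace Hub.

    Import is done lazily inside the function so that merely importing
    this module does not require transformers to be installed.
    """
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    # trust_remote_code=True lets transformers run the custom GLM
    # modeling code published alongside the checkpoint.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id, trust_remote_code=True)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_glm()  # downloads several GB of weights
```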