LLaDA
Could you let me know when the LLaDA 1B model will be released?
I was going to ask this as well and then saw your issue. If the 1B model can't be released, what's the reason?
Thanks for your interest!
We do have an LLaDA model with 1 billion parameters, trained on 400 billion tokens without an annealing stage. It is a semi-finished product that has not been fully trained and was used only for exploratory experiments. Therefore, we have no plans to open-source this model.