Dennis Sedov
Yes, I have exactly the same problem. What helped me was deleting the config files.
How exactly do you switch the whole training/finetuning pipeline to FP16 or BF16? I can't find any arguments that the optimizer takes. I tried converting all the weights after...
I'm trying to train an LLM using this model: https://github.com/ml-explore/mlx-examples/blob/main/llms/llama/llama.py

I've duplicated the code and added this to convert the weights:

```python
def convert_to_f16(self):
    self.apply(self.weights_to_f16)

def weights_to_f16(self, m):
    # apply() expects the map function to return the converted array
    return m.astype(mx.float16)
```
...
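A minimal sketch of one way to do this, assuming `nn.Module.apply` (which maps a function over every parameter array) and `mx.issubdtype`; the helper name `cast_params` is hypothetical, not part of the mlx-examples code:

```python
import mlx.core as mx
import mlx.nn as nn

def cast_params(model: nn.Module, dtype: mx.Dtype = mx.float16) -> nn.Module:
    # Hypothetical helper: cast only floating-point parameters and leave
    # any integer arrays untouched, so just the trainable weights change
    # precision.
    model.apply(
        lambda p: p.astype(dtype) if mx.issubdtype(p.dtype, mx.floating) else p
    )
    return model

# Usage sketch: cast *before* constructing the optimizer, so the lazily
# initialized optimizer state (e.g. Adam moments) is created in the same
# dtype as the parameters it mirrors.
# model = cast_params(model, mx.bfloat16)
```

Since MLX optimizers create their state lazily from the parameters (e.g. via `mx.zeros_like(p)`), the state should follow the weights' dtype, which would explain why the optimizer itself exposes no dtype argument. Depending on your MLX version, `nn.Module.set_dtype(mx.float16)` may also cover the same thing.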