frankh077
It is not possible to set `trainer.max_epochs` together with `trainer.max_steps`, because when `trainer.max_steps` is set, `trainer.max_epochs` is ignored by default. To use `trainer.max_epochs`, you must set it without `trainer.max_steps`, but by default...
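To make the precedence concrete, here is a toy training loop that mimics the behavior described above: a step budget, when present, overrides the epoch budget. This is an illustration only, not the NeMo or PyTorch Lightning implementation, and the function and parameter names are hypothetical.

```python
def run_training(steps_per_epoch, max_epochs=None, max_steps=None):
    """Toy loop: return how many optimizer steps actually execute.

    Mirrors the precedence described above (illustration only, not
    the real NeMo/Lightning trainer): if max_steps is set, it is the
    sole stopping criterion and max_epochs is ignored.
    """
    step = epoch = 0
    while True:
        if max_steps is not None and step >= max_steps:
            return step  # step budget wins, max_epochs ignored
        if max_steps is None and max_epochs is not None and epoch >= max_epochs:
            return step  # epoch budget only applies when max_steps is unset
        if max_steps is None and max_epochs is None:
            return step  # nothing to do without a budget
        step += 1
        if step % steps_per_epoch == 0:
            epoch += 1

# With max_steps set, max_epochs (2 epochs = 20 steps) is ignored:
print(run_training(steps_per_epoch=10, max_epochs=2, max_steps=35))  # 35
# Without max_steps, max_epochs drives the stop:
print(run_training(steps_per_epoch=10, max_epochs=2))  # 20
```

So in a real config you would pick one budget: either `trainer.max_steps` alone, or `trainer.max_epochs` without `trainer.max_steps`.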
I'm having the same problem with Mistral 7B PEFT when running this command: ` python3 /opt/NeMo/scripts/checkpoint_converters/convert_mistral_7b_hf_to_nemo.py --input_name_or_path=/workspace/mistral-7B-hf --output_path=mistral.nemo ` This is the error: ` [NeMo I 2024-04-02 17:52:08 convert_mistral_7b_hf_to_nemo:149] loading...
@pradeepdev-1995 Yes, but through the `nemo-framework-training` container. This is the command I used: `docker run --gpus device=1 --shm-size=2g --net=host --ulimit memlock=-1 --rm -it -v ${PWD}:/workspace -w /workspace -v ${PWD}/results:/results nvcr.io/nvaie/nemo-framework-training:23.08.03...
I think it is mandatory, since the environment and the necessary tools are inside the container; but if you can build the environment yourself it should work, and you can base it on...