VisualGLM-6B
Where can I get the file visualglm-6b/300/mp_rank_00_model_states.pt
I ran the Q-LoRA fine-tuning script and got the following error:
File "/home/z47xu/Code/VisualGLM/VisualGLM-6B/finetune_visualglm.py", line 180, in
I am wondering whether I need to download the file mp_rank_00_model_states.pt.
Your help is much appreciated.
Should I do something like this?
```python
import argparse

from transformers import AutoTokenizer, AutoModel, AutoConfig
from sat.training.model_io import save_checkpoint

config = AutoConfig.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("visualglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("visualglm-6b", trust_remote_code=True).half().quantize(8).cuda()

args = argparse.Namespace(
    num_layers=config.num_layers,
    vocab_size=config.vocab_size,
    hidden_size=config.hidden_size,
    num_attention_heads=config.num_attention_heads,
    max_sequence_length=config.max_sequence_length,
    bos_token_id=tokenizer.bos_token_id,
    mask_token_id=tokenizer.mask_token_id,
    gmask_token_id=tokenizer.gmask_token_id,
    hidden_dropout=0.,
    attention_dropout=0.,
    inner_hidden_size=None,
    hidden_size_per_attention_head=None,
    checkpoint_activations=True,
    checkpoint_num_layers=1,
    layernorm_order='post',
    model_parallel_size=1,
    world_size=1,
    rank=0,
    skip_init=False,
    use_gpu_initialization=True,
    save='model_check_point',
    deepspeed=None,
    mode='inference',
    tokenizer_type="THUDM/chatglm-6b",
)

save_checkpoint(1, model, None, None, args)
```
We don't have visualglm-6b/300. I don't know what code you are running.
I ran the Q-LoRA fine-tuning script, `bash finetune/finetune_visualglm_qlora.sh`, and got the above error.
I don't know why, but our fine-tuning script never requests files under visualglm-6b/300. I can only assume that you have changed the code or folder structure, which is out of my control.
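For reference, the path in the error message matches the checkpoint layout commonly produced by SAT (SwissArmyTransformer): a save directory, a subdirectory named after the training iteration, and a per-model-parallel-rank state file. The sketch below is only an illustration of that naming assumption, inferred from the path `visualglm-6b/300/mp_rank_00_model_states.pt` in this thread; it is not part of the VisualGLM codebase.

```python
import os

def sat_checkpoint_path(save_dir: str, iteration: int, mp_rank: int = 0) -> str:
    """Build the checkpoint path under the assumed SAT convention:
    <save_dir>/<iteration>/mp_rank_XX_model_states.pt.
    (Illustrative only; the layout is an assumption based on the
    error message quoted in this thread.)"""
    return os.path.join(save_dir, str(iteration), f"mp_rank_{mp_rank:02d}_model_states.pt")

print(sat_checkpoint_path("visualglm-6b", 300))
```

Under this reading, "300" would be an iteration number, so the trainer would only look there if something told it to resume from iteration 300 of a checkpoint directory named visualglm-6b.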