zyzhang1130

39 comments from zyzhang1130

When I did

```python
model.save_pretrained("my-awesome-model")
tokenizer.save_pretrained("my-awesome-model_tokenizer")
```

to save my model trained from `andreaskoepf/pythia-1.4b-gpt4all-pretrain` instead, here are the files I got:

![image](https://github.com/artidoro/qlora/assets/36942574/a31bd48b-7171-4572-8ee9-cbb3ac232db6)

Is it normal that the saved model and...

> yes, this just saves the set of adapters. you'll have to merge it after loading the model again.

May I ask how this differs from how we usually load...
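For context, a minimal sketch of the merge step mentioned in the quote, assuming the adapters were saved to `my-awesome-model` as above; the half-precision reload and the output path are placeholders (merging directly onto a 4-bit base as used during QLoRA training is more involved):

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Reload the base model that the adapters were trained on.
base_model = AutoModelForCausalLM.from_pretrained(
    "andreaskoepf/pythia-1.4b-gpt4all-pretrain",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained("my-awesome-model_tokenizer")

# "my-awesome-model" contains only the adapter weights saved earlier.
model = PeftModel.from_pretrained(base_model, "my-awesome-model")

# Fold the adapter weights into the base model so it can be used and saved
# as a regular transformers model.
merged = model.merge_and_unload()
merged.save_pretrained("my-awesome-model-merged")
```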

> LGTM. Please update the English/Chinese tutorial `tutorial/204-service.html`

Done.

> LGTM, please also update the English & Chinese version of tutorial `tutorial/204-service.html`

Done.

> Modified into [WIP]; we will review this PR after merging `https://github.com/modelscope/agentscope/pull/332`

Removed [WIP], as `https://github.com/modelscope/agentscope/pull/332` has been merged. This PR has been updated according to `https://github.com/modelscope/agentscope/pull/332`. The...

@BenjaminBossan I see. Appreciate the in-depth explanation!

Should we just use `formatting_func` and `data_collator` instead? It seems that together they serve the same purpose as `setup_chat_format`, except when you want to include a system prompt in your training data...
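For reference, a rough sketch (not from the discussion) of what passing `formatting_func` and a `data_collator` to `SFTTrainer` looks like; the dataset fields, response template, and base model are assumptions, and constructor arguments differ a bit across `trl` versions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import SFTTrainer, DataCollatorForCompletionOnlyLM

model_name = "facebook/opt-350m"  # placeholder base model
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

def formatting_func(examples):
    # Build one training string per example; the "prompt"/"response" fields are hypothetical.
    return [
        f"### Question: {q}\n### Answer: {a}"
        for q, a in zip(examples["prompt"], examples["response"])
    ]

# Compute the loss only on the answer tokens, masking out the prompt.
collator = DataCollatorForCompletionOnlyLM("### Answer:", tokenizer=tokenizer)

trainer = SFTTrainer(
    model=model,
    train_dataset=train_dataset,  # assumed to be defined elsewhere
    formatting_func=formatting_func,
    data_collator=collator,
    tokenizer=tokenizer,  # `processing_class=` in newer trl releases
)
trainer.train()
```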

> Hi @zyzhang1130 We do have tests that run saving checks on `AutoModelWithValueHeadxxx`:
>
> https://github.com/huggingface/trl/blob/13454d2f4b243b7260fa4ec828297812c3f975fc/tests/test_modeling_value_head.py#L102
>
> but here it seems you are using the PeftModel interface. Can you elaborate a...
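For clarity, a hedged sketch (not the thread's code) of the two saving paths being contrasted in that quote; the model names and output paths are placeholders:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM
from trl import AutoModelForCausalLMWithValueHead

# Path exercised by the linked test: save through the value-head wrapper.
vh_model = AutoModelForCausalLMWithValueHead.from_pretrained("gpt2")
vh_model.save_pretrained("vh-checkpoint")  # base weights plus the value head

# PeftModel path: save_pretrained writes only the adapter weights and
# adapter_config.json, not the base model weights.
base = AutoModelForCausalLM.from_pretrained("gpt2")
peft_model = get_peft_model(base, LoraConfig(task_type="CAUSAL_LM"))
peft_model.save_pretrained("adapter-checkpoint")
```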