
Fine-tuning completes, but adapter_model.bin is an empty 443-byte file

aaronliruns opened this issue 2 years ago · 4 comments

I have tried many fine-tuning training data files with valid format, but I always get a 443-byte adapter_model.bin model file. All training epochs ran successfully on 1 x A100 GPU (Colab Pro+). Has anyone seen the same issue? What could be the cause?
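A quick way to confirm the file really contains no weights (a minimal sketch, assuming the adapter was written to the default `./lora-alpaca` output directory):

```python
import torch

# When the bug hits, the saved adapter is an empty dict; the ~443 bytes
# are just serialization overhead.
state_dict = torch.load("lora-alpaca/adapter_model.bin", map_location="cpu")
print(len(state_dict))  # 0 -> no lora_A / lora_B tensors were saved
```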

Thanks

Aaron

aaronliruns · May 10 '23 16:05

I've been experiencing the same issue. `get_peft_model_state_dict` seems to be returning an empty dict, hence the empty model file. This filter:

```python
# Inside peft's get_peft_model_state_dict: keep only the LoRA weights that
# belong to the active adapter (plus any bias terms).
to_return = {
    k: v
    for k, v in to_return.items()
    if (("lora_" in k and adapter_name in k) or ("bias" in k))
}
```

seems to be filtering the result set down to nothing: `lora_` is in the keys, but `adapter_name` never is. Here `adapter_name` is `default`, while an example key is `base_model.model.model.layers.31.self_attn.v_proj.lora_B.weight`.
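To illustrate, here is a standalone repro of that filter using made-up tensors shaped like the key above (the state dict is hypothetical, just to show the behavior):

```python
import torch

# Hypothetical entries in the shape of the reported keys; note that the
# adapter name ("default") appears nowhere in them.
to_return = {
    "base_model.model.model.layers.31.self_attn.v_proj.lora_A.weight": torch.zeros(8, 4096),
    "base_model.model.model.layers.31.self_attn.v_proj.lora_B.weight": torch.zeros(4096, 8),
}
adapter_name = "default"

filtered = {
    k: v
    for k, v in to_return.items()
    if (("lora_" in k and adapter_name in k) or ("bias" in k))
}
print(filtered)  # {} -- adapter_name never matches, so every key is dropped
```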

No idea about the cause yet.

EDIT: Seems to be a known issue. Check e.g. https://github.com/tloen/alpaca-lora/issues/319

haapanen · May 10 '23 16:05

Had the same problem. Commented out these lines:

https://github.com/tloen/alpaca-lora/blob/8bb8579e403dc78e37fe81ffbb253c413007323f/finetune.py#L263-L269
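For reference, the block at those lines is the `state_dict` monkey-patch (reproduced from memory of that commit; double-check against the link above):

```python
# finetune.py: overrides model.state_dict so the Trainer checkpoints only
# the LoRA weights. Commenting this out avoids filtering the state dict twice.
old_state_dict = model.state_dict
model.state_dict = (
    lambda self, *_, **__: get_peft_model_state_dict(
        self, old_state_dict()
    )
).__get__(model, type(model))
```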

Seems to work.

It seems like the save method (`save_pretrained`) already calls `get_peft_model_state_dict` itself:

https://github.com/huggingface/peft/blob/4fd374e80d670781c0d82c96ce94d1215ff23306/src/peft/peft_model.py#L122-L130
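In other words, `save_pretrained` already does something like this internally (a simplified paraphrase of the linked peft source, not the verbatim code), so with the monkey-patch in place the filter runs twice, and the second pass matches nothing:

```python
import os
import torch
from peft import get_peft_model_state_dict

# Simplified sketch of PeftModel.save_pretrained (paraphrased):
def save_pretrained(self, save_directory, **kwargs):
    os.makedirs(save_directory, exist_ok=True)
    # save_pretrained filters the state dict itself. If model.state_dict was
    # already patched to pre-filter, the adapter name has presumably been
    # stripped from the keys, so this second filter returns an empty dict.
    output_state_dict = get_peft_model_state_dict(self, kwargs.get("state_dict"))
    torch.save(output_state_dict, os.path.join(save_directory, "adapter_model.bin"))
```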

florianjuengermann · May 11 '23 07:05

You meant to say to comment them out? They are actually not commented:

https://github.com/tloen/alpaca-lora/blob/8bb8579e403dc78e37fe81ffbb253c413007323f/finetune.py#L263-L269

Yes, I see that the lines of code you pointed to in peft save the model to the output directory.

Thanks


aaronliruns · May 11 '23 08:05

Duplicate of https://github.com/tloen/alpaca-lora/issues/334

aaronliruns · May 12 '23 04:05