llm-foundry
KeyError: 'attn_pdrop' with t5-small_dolly_sft.yaml when running inference/convert_composer_to_hf.py
I've set up the t5-small_dolly_sft.yaml file with run_name: t5-small-dolly and ran composer train.py yamls/finetune/t5-small_dolly_sft.yaml train_loader.dataset.split=train to generate the checkpoint.
From the scripts directory, I run:
python inference/convert_composer_to_hf.py \
--composer_path train/t5-small-dolly/checkpoints/ep1-ba235-rank0.pt \
--hf_output_path t5-small-dolly-hf \
--output_precision bf16
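As a first debugging step, it may help to dump which dropout-related keys the loaded config actually contains. A minimal sketch (the helper and the toy config dict below are illustrative assumptions, not the actual checkpoint layout; in practice you would run it on the hf_config_dict the script builds):

```python
# Recursively collect dotted paths to any dict key mentioning "drop",
# to see where dropout settings actually live in a nested config dict.
def find_dropout_keys(obj, prefix=''):
    hits = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            path = f'{prefix}.{k}' if prefix else str(k)
            if 'drop' in str(k).lower():
                hits.append(path)
            hits.extend(find_dropout_keys(v, path))
    return hits

# Toy stand-in for a loaded HF config dict: a T5-style config exposes
# 'dropout_rate', not the GPT-2/MPT-style 'attn_pdrop' the script expects.
toy_config = {'d_model': 512, 'dropout_rate': 0.1,
              'task_specific_params': {'summarization': {'min_length': 30}}}
print(find_dropout_keys(toy_config))  # ['dropout_rate']
```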
to convert the checkpoint, and get the error:
attn_config['attn_pdrop'] = hf_config_dict['attn_pdrop']
KeyError: 'attn_pdrop'
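The underlying mismatch appears to be that T5-family configs in transformers name their dropout field dropout_rate, while attn_pdrop is a GPT-2/MPT-style key, so a direct hf_config_dict['attn_pdrop'] lookup fails for T5. One defensive pattern (a sketch of a possible workaround, not the actual fix in llm-foundry; the alias list and default are assumptions) would be to try the known aliases and fall back to a default:

```python
# Hypothetical defensive lookup: different HF architectures name their
# attention-dropout key differently (e.g. GPT-2 uses 'attn_pdrop',
# T5 uses 'dropout_rate'), so try known aliases and fall back to 0.0.
def get_attn_dropout(hf_config_dict):
    for key in ('attn_pdrop', 'attention_dropout', 'dropout_rate'):
        if key in hf_config_dict:
            return hf_config_dict[key]
    return 0.0

t5_like = {'d_model': 512, 'dropout_rate': 0.1}
print(get_attn_dropout(t5_like))  # 0.1
```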
The hf_config_dict does not seem to have the expected keys here to continue. Is this the correct way to proceed, or has there been a change to the expected values in hf_config_dict? Or is there a way to run it as is?