VILA
VILA copied to clipboard
What are the modifications in `llava/train/transformers_replace`?
Hi, thanks for the nice work! I wonder what the main modifications in llava/train/transformers_replace are compared to the original implementation in transformers==4.31.0, as specified in pyproject.toml. Also, in environment_setup.sh, transformers==4.36.2 is installed:
pip install git+https://github.com/huggingface/[email protected]
Why do we install two different versions of transformers?
If I want to use a higher version of transformers, e.g. 4.38, are there changes needed for the files in this folder? Many thanks!
We have manually changed some of the original implementations to better support the grouping strategy and flash attention, and recommend that every VILA user apply these changes.
Although our codebase should work with higher versions of transformers, we haven't tested them thoroughly and cannot promise anything. Please use v4.36.2 for reproducibility.
I found "transformers_version": "4.38.1", in the config.json of VILA-2.7B. Which version of transformers should we use to run VILA-2.7B? By the way, is there any plan to allow loading VILA models using transformers API only without relying on the current repo?
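As an aside, the `transformers_version` field in a checkpoint's config.json only records the version the model was saved with, not a hard requirement. A minimal sketch of reading it (using an inline sample config here; in practice you would load the config.json file shipped with the checkpoint):

```python
import json

# Inline stand-in for a checkpoint's config.json (illustrative values only).
sample = '{"model_type": "llama", "transformers_version": "4.38.1"}'

config = json.loads(sample)
# The version of transformers that serialized this checkpoint.
print(config.get("transformers_version"))  # → 4.38.1
```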
I noticed that you manually implemented class LlamaForCausalLM(LlamaPreTrainedModel) in transformers_replace. However, in the code (either train or infer), your implemented class is not used. What's wrong with it?
Thanks for your reply
@Lyken17: Could you please share some details on this?
please use 4.36.2 for now, we will upgrade to 4.37 in the next release :)
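Since the codebase is sensitive to the installed transformers version, one could add a small guard that fails fast on a mismatch. This is a sketch, not part of VILA's actual code; the version comparison below uses a minimal dotted-release parser rather than a packaging library:

```python
def parse_version(v: str) -> tuple:
    # Minimal parser for dotted release versions like "4.36.2".
    # Does not handle pre-release suffixes such as "4.37.0.dev0".
    return tuple(int(part) for part in v.split("."))

# EXPECTED is the version recommended in this thread.
EXPECTED = "4.36.2"

def check_transformers_version(installed: str) -> bool:
    """Return True if the installed version matches the tested one."""
    return parse_version(installed) == parse_version(EXPECTED)

print(check_transformers_version("4.36.2"))  # → True
print(check_transformers_version("4.38.1"))  # → False
```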
Also, why manually replace the files? Why not use monkeypatching, or re-register your versions with transformers using exist_ok=True?
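To illustrate the monkeypatching alternative being suggested: instead of overwriting installed files, a package can rebind a name in the third-party module at import time, before any model code constructs the class. The sketch below uses a synthetic stand-in module; the names are illustrative, not VILA's or transformers' actual code:

```python
import types

# Stand-in for a third-party module (e.g. a transformers submodule).
thirdparty = types.ModuleType("thirdparty")

class OriginalAttention:
    def forward(self, x):
        return x

thirdparty.OriginalAttention = OriginalAttention

class PatchedAttention(OriginalAttention):
    def forward(self, x):
        # A modified implementation (e.g. a flash-attention variant)
        # would go here; doubling the input is just a visible marker.
        return x * 2

# The monkeypatch: rebind the name so later lookups get the patched class.
thirdparty.OriginalAttention = PatchedAttention

print(thirdparty.OriginalAttention().forward(3))  # → 6
```

The trade-off is that a monkeypatch only takes effect for code that looks the name up after the patch runs, whereas copying files changes behavior unconditionally, which may be why the file-replacement approach was chosen.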
I'm also interested in understanding this better, as I'm trying to combine a few things from VILA and LLaVA-NeXt, and this makes me concerned that something might break in an unexpected way (e.g., run but give poor results).
In environment_setup.sh, if we use transformers v4.37.2 but skip the following, what are the repercussions?
# What happens if we skip this?
cp -rv ./llava/train/transformers_replace/* $site_pkg_path/transformers/
cp -rv ./llava/train/deepspeed_replace/* $site_pkg_path/deepspeed/
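For context, `$site_pkg_path` in the snippet above points at the installed packages directory. A sketch of how it could be derived (the actual environment_setup.sh may compute it differently):

```shell
# Locate the active environment's site-packages directory via Python itself.
site_pkg_path=$(python3 -c "import site; print(site.getsitepackages()[0])")
echo "$site_pkg_path"
```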
Are there PRs for merging these changes into the respective libraries, or do they break other things and thus can't be merged?
This issue has been inactive for a while. Feel free to reopen if it still exists.