Support latest Transformers and new cache design
from transformers.utils import WEIGHTS_NAME, WEIGHTS_INDEX_NAME, cached_path, hf_bucket_url
ImportError: cannot import name 'cached_path' from 'transformers.utils'
fails with transformers 4.23.1 :(
import os
from huggingface_hub import snapshot_download
from transformers.utils import is_offline_mode

# Download (or reuse from the local cache) every file in the model repo
path = snapshot_download(
    repo_id=model_name,
    allow_patterns=["*"],
    local_files_only=is_offline_mode(),
    cache_dir=os.getenv("TRANSFORMERS_CACHE", None),
)
How about something like this ^^?
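(Just to illustrate how the returned path could be used afterwards; loading the config is only an example, not part of the suggested change:)

from transformers import AutoConfig

# snapshot_download returns the local snapshot directory, so anything loaded
# from 'path' afterwards works fully offline.
config = AutoConfig.from_pretrained(path)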
Thanks for the suggestion. The implementation I chose is similar to this. If you have a chance, try it out and let me know if you run into any bugs :)
I have tested this @mrwyattii and it works fine. One thing to note: earlier I had to pass the path as TRANSFORMERS_CACHE/models-bigscience-bloom, and now it is just TRANSFORMERS_CACHE.
I would say the change you have made is much better and more intuitive. Thanks for this PR. :)
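Just to spell out that difference (the models-bigscience-bloom directory name is simply what appeared in my local cache):

import os

# Before this change: the model-specific sub-directory inside the cache had to be passed.
old_path = os.path.join(os.environ["TRANSFORMERS_CACHE"], "models-bigscience-bloom")

# After this change: the top-level cache directory is enough; the per-model
# sub-directory is resolved internally.
new_path = os.environ["TRANSFORMERS_CACHE"]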
I would say that after a few versions we could maybe drop support for older transformers? I don't really think it's needed, since I think there is only a handful of people using MII for BLOOM :)
Can we merge this?
Yes, we'll get this merged very soon! Sorry for the delay.