mm-cot
Hugging Face spiece.model private
I am unable to run the command for rationale generation during inference; I get the error below.
raise RepositoryNotFoundError(
transformers.utils.hub.RepositoryNotFoundError: 401 Client Error: Repository not found for url: https://huggingface.co/models/rationale/resolve/main/spiece.model. If the repo is private, make sure you are authenticated.

raise EnvironmentError(
OSError: models/rationale is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True
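For context: because models/rationale is not found as a local folder, transformers falls back to treating the string as a Hugging Face Hub repo id, so the failure surfaces as a 401/RepositoryNotFoundError even though the underlying problem is just a missing local path. Below is a minimal sketch of a local check, assuming the rationale checkpoint was downloaded to models/MM-CoT-UnifiedQA-base-Rationale as in the readme and that T5Tokenizer is the right class for the spiece.model file (both are assumptions on my part):

```python
import os
from transformers import T5Tokenizer  # needs sentencepiece installed for spiece.model

# Folder name per the readme download instructions (adjust if you saved
# the rationale checkpoint elsewhere).
ckpt_dir = "models/MM-CoT-UnifiedQA-base-Rationale"

if not os.path.isdir(ckpt_dir):
    # A missing local folder is exactly what makes from_pretrained fall
    # back to a Hub lookup and raise RepositoryNotFoundError.
    raise SystemExit(f"{ckpt_dir} is not a local folder; download the checkpoints first.")

# Loading directly from the local folder confirms that spiece.model and
# the other tokenizer files are where the script expects them.
tokenizer = T5Tokenizer.from_pretrained(ckpt_dir)
print("Loaded tokenizer from", ckpt_dir)
```

If this load succeeds, pointing --evaluate_dir at the same folder should avoid the Hub lookup entirely.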
I also encountered this problem. How can it be solved?
Firstly, check that the models are saved locally as per the readme instructions (there is a quick check sketched at the end of this comment).
Secondly, if you are running run_inference.sh, that file seems to be outdated: the values of the --evaluate_dir params in it differ from those in the readme.
So, this change to the run_inference.sh file should cure the issue:
--evaluate_dir models/rationale -> --evaluate_dir models/MM-CoT-UnifiedQA-base-Rationale
--evaluate_dir models/answer -> --evaluate_dir models/MM-CoT-UnifiedQA-base-Answer
Alternatively, renaming the downloaded subfolders the other way around (to models/rationale and models/answer, matching the script) should also work.
I'll submit a PR for this.
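For reference, here is the quick check mentioned above that both local checkpoints are in place before rerunning run_inference.sh. This is only a sketch: the folder names are taken from the readme-style names quoted above, and spiece.model is the file the traceback was looking for.

```python
import os

# Expected local checkpoint folders per the readme.
expected = [
    "models/MM-CoT-UnifiedQA-base-Rationale",
    "models/MM-CoT-UnifiedQA-base-Answer",
]

for folder in expected:
    spiece = os.path.join(folder, "spiece.model")
    if os.path.isfile(spiece):
        print(f"OK      {folder}")
    else:
        # Either the checkpoint was not downloaded, or the folder name
        # does not match what --evaluate_dir points at.
        print(f"MISSING {spiece}")
```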