
Hugging Face spiece.model private

Open pksekar opened this issue 2 years ago • 2 comments

I am unable to run the rationale generation command for inference; I get the errors below.

raise RepositoryNotFoundError(
transformers.utils.hub.RepositoryNotFoundError: 401 Client Error: Repository not found for url: https://huggingface.co/models/rationale/resolve/main/spiece.model. If the repo is private, make sure you are authenticated.

raise EnvironmentError(
OSError: models/rationale is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True
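For context, both errors come from the same fallback in transformers: from_pretrained() only treats its argument as a local checkpoint when that directory actually exists on disk; otherwise the string is interpreted as a Hub repo id and looked up at huggingface.co, which is why a 401 shows up even though nothing private is involved. A minimal sketch of that behaviour, assuming a T5-style tokenizer (spiece.model is a SentencePiece vocabulary) and reusing the models/rationale path from the traceback:

```python
import os

from transformers import T5Tokenizer  # assumed; spiece.model is the SentencePiece vocab loaded by T5-style tokenizers

model_dir = "models/rationale"  # path taken from the traceback above

if os.path.isdir(model_dir):
    # An existing directory is loaded as a local checkpoint; no network access needed.
    tokenizer = T5Tokenizer.from_pretrained(model_dir)
else:
    # A non-existent path is treated as a Hub repo id, so transformers queries
    # https://huggingface.co/models/rationale and fails with the 401 shown above.
    print(f"{model_dir} is missing locally; transformers will look for it on the Hub and fail.")
```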

pksekar · Feb 23 '23 06:02

I also encountered this problem. How can it be solved?

soga2015 · Feb 25 '23 01:02

Firstly, check that the models are saved locally as described in the readme instructions.
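A quick check along these lines can confirm that (the folder names are the ones from the readme, as quoted below; the expected files are an assumption based on typical T5-style checkpoints):

```python
import os

expected_dirs = [
    "models/MM-CoT-UnifiedQA-base-Rationale",
    "models/MM-CoT-UnifiedQA-base-Answer",
]
expected_files = ["config.json", "spiece.model"]  # assumption: usual transformers checkpoint contents

for d in expected_dirs:
    if not os.path.isdir(d):
        print(f"missing directory: {d}")
        continue
    for f in expected_files:
        if not os.path.isfile(os.path.join(d, f)):
            print(f"missing file: {os.path.join(d, f)}")
```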

Secondly, if you are running run_inference.sh, that file seems to be outdated: the values of its --evaluate_dir params differ from the ones in the readme.

So the following changes to the run_inference.sh file should cure the issue:

--evaluate_dir models/rationale -> --evaluate_dir models/MM-CoT-UnifiedQA-base-Rationale
--evaluate_dir models/answer -> --evaluate_dir models/MM-CoT-UnifiedQA-base-Answer

Alternatively, renaming the model subfolders the other way round, so they match the paths already in run_inference.sh, should also work.
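If you go the renaming route, a hypothetical one-off helper like this would do it (folder names taken from the two flag values above; not part of the repo):

```python
import os

# Rename the downloaded checkpoint folders to the names run_inference.sh currently expects.
renames = {
    "models/MM-CoT-UnifiedQA-base-Rationale": "models/rationale",
    "models/MM-CoT-UnifiedQA-base-Answer": "models/answer",
}

for src, dst in renames.items():
    if os.path.isdir(src) and not os.path.exists(dst):
        os.rename(src, dst)
```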

I'll submit a PR for this.

igor-cheb · Feb 28 '23 02:02