OmniParser
[without conda] python gradio_demo.py: requests.exceptions.HTTPError: 404 Client Error:
Python Version:
Python 3.12.9
PIP:
Package Version
------------------------- -----------
accelerate 1.4.0
aiofiles 23.2.1
aiohappyeyeballs 2.4.6
aiohttp 3.11.12
aiosignal 1.3.2
albucore 0.0.13
albumentations 1.4.10
altair 5.5.0
annotated-types 0.7.0
anthropic 0.46.0
anyio 3.7.1
astor 0.8.1
attrs 25.1.0
azure-core 1.32.0
azure-identity 1.20.0
beautifulsoup4 4.13.3
blinker 1.9.0
boto3 1.36.24
botocore 1.36.24
cachetools 5.5.1
certifi 2025.1.31
cffi 1.17.1
cfgv 3.4.0
charset-normalizer 3.4.1
click 8.1.8
comtypes 1.4.10
contourpy 1.3.1
cryptography 44.0.1
cycler 0.12.1
Cython 3.0.12
dashscope 1.22.1
decorator 5.1.1
defusedxml 0.7.1
dill 0.3.9
distlib 0.3.9
distro 1.9.0
easyocr 1.7.2
einops 0.8.0
fastapi 0.115.8
ffmpy 0.5.0
filelock 3.17.0
fire 0.7.0
fonttools 4.56.0
frozenlist 1.5.0
fsspec 2025.2.0
gitdb 4.0.12
GitPython 3.1.44
google-auth 2.38.0
gradio 5.13.2
gradio_client 1.6.0
groq 0.18.0
h11 0.14.0
httpcore 1.0.7
httpx 0.28.1
huggingface-hub 0.29.1
identify 2.6.7
idna 3.10
imageio 2.37.0
imgaug 0.4.0
iniconfig 2.0.0
Jinja2 3.1.5
jiter 0.8.2
jmespath 1.0.1
joblib 1.4.2
jsonschema 4.22.0
jsonschema-specifications 2024.10.1
kiwisolver 1.4.8
lazy_loader 0.4
lmdb 1.6.2
lxml 5.3.1
markdown-it-py 3.0.0
MarkupSafe 2.1.5
matplotlib 3.10.0
mdurl 0.1.2
MouseInfo 0.1.3
mpmath 1.3.0
msal 1.31.1
msal-extensions 1.2.0
multidict 6.1.0
narwhals 1.27.1
networkx 3.4.2
ninja 1.11.1.3
nodeenv 1.9.1
numpy 1.26.4
nvidia-cublas-cu12 12.4.5.8
nvidia-cuda-cupti-cu12 12.4.127
nvidia-cuda-nvrtc-cu12 12.4.127
nvidia-cuda-runtime-cu12 12.4.127
nvidia-cudnn-cu12 9.1.0.70
nvidia-cufft-cu12 11.2.1.3
nvidia-curand-cu12 10.3.5.147
nvidia-cusolver-cu12 11.6.1.9
nvidia-cusparse-cu12 12.3.1.170
nvidia-cusparselt-cu12 0.6.2
nvidia-nccl-cu12 2.21.5
nvidia-nvjitlink-cu12 12.4.127
nvidia-nvtx-cu12 12.4.127
openai 1.3.5
opencv-contrib-python 4.11.0.86
opencv-python 4.11.0.86
opencv-python-headless 4.11.0.86
opt-einsum 3.3.0
orjson 3.10.15
packaging 24.2
paddleocr 2.9.1
paddlepaddle 2.6.2
pandas 2.2.3
pillow 11.1.0
pip 25.0.1
platformdirs 4.3.6
pluggy 1.5.0
portalocker 2.10.1
pre-commit 3.8.0
propcache 0.2.1
protobuf 5.29.3
psutil 7.0.0
py-cpuinfo 9.0.0
pyarrow 19.0.1
pyasn1 0.6.1
pyasn1_modules 0.4.1
PyAutoGUI 0.9.54
pyclipper 1.3.0.post6
pycparser 2.22
pydantic 2.10.6
pydantic_core 2.27.2
pydeck 0.9.1
pydub 0.25.1
PyGetWindow 0.0.9
Pygments 2.19.1
PyJWT 2.10.1
PyMsgBox 1.0.9
pyparsing 3.2.1
pyperclip 1.9.0
PyRect 0.2.0
PyScreeze 1.0.1
pytest 8.3.3
pytest-asyncio 0.23.6
python-bidi 0.6.6
python-dateutil 2.9.0.post0
python-docx 1.1.2
python-multipart 0.0.20
python3-xlib 0.15
pytweening 1.2.0
pytz 2025.1
PyYAML 6.0.2
RapidFuzz 3.12.1
referencing 0.36.2
regex 2024.11.6
requests 2.32.3
rich 13.9.4
rpds-py 0.22.3
rsa 4.9
ruff 0.6.7
s3transfer 0.11.2
safehttpx 0.1.6
safetensors 0.5.2
scikit-image 0.25.2
scikit-learn 1.6.1
scipy 1.15.2
screeninfo 0.8.1
seaborn 0.13.2
semantic-version 2.10.0
setuptools 75.8.0
shapely 2.0.7
shellingham 1.5.4
six 1.17.0
smmap 5.0.2
sniffio 1.3.1
soupsieve 2.6
starlette 0.45.3
streamlit 1.42.1
supervision 0.18.0
sympy 1.13.1
tenacity 9.0.0
termcolor 2.5.0
threadpoolctl 3.5.0
tifffile 2025.2.18
timm 1.0.14
tokenizers 0.21.0
toml 0.10.2
tomli 2.2.1
tomlkit 0.13.2
torch 2.6.0
torchvision 0.21.0
tornado 6.4.2
tqdm 4.67.1
transformers 4.49.0
triton 3.2.0
typer 0.15.1
typing_extensions 4.12.2
tzdata 2025.1
uiautomation 2.0.20
ultralytics 8.3.70
ultralytics-thop 2.0.14
urllib3 2.3.0
uvicorn 0.34.0
virtualenv 20.29.2
watchdog 6.0.0
websocket-client 1.8.0
websockets 14.2
yarl 1.18.3
I am trying to run python gradio_demo.py
without a conda env; I am using a pyenv virtualenv instead.
When I do, I hit the following error messages:
[2025-02-21 20:25:26,345] [ WARNING] easyocr.py:80 - Neither CUDA nor MPS are available - defaulting to CPU. Note: This module is much faster with a GPU.
Traceback (most recent call last):
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/utils/_http.py", line 409, in hf_raise_for_status
response.raise_for_status()
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/requests/models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/weights/icon_caption_florence/resolve/main/config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/transformers/utils/hub.py", line 342, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 862, in hf_hub_download
return _hf_hub_download_to_cache_dir(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 969, in _hf_hub_download_to_cache_dir
_raise_on_head_call_error(head_call_error, force_download, local_files_only)
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1486, in _raise_on_head_call_error
raise head_call_error
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1376, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(
^^^^^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1296, in get_hf_file_metadata
r = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 280, in _request_wrapper
response = _request_wrapper(
^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 304, in _request_wrapper
hf_raise_for_status(response)
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/huggingface_hub/utils/_http.py", line 458, in hf_raise_for_status
raise _format(RepositoryNotFoundError, message, response) from e
huggingface_hub.errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-67b870bb-2ff0abd732ae6ac0004bfe11;166bac72-f4c7-4fb8-825c-99a99c427996)
Repository Not Found for url: https://huggingface.co/weights/icon_caption_florence/resolve/main/config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/dragon/Test_Projects/OmniParser/gradio_demo.py", line 16, in <module>
caption_model_processor = get_caption_model_processor(model_name="florence2", model_name_or_path="weights/icon_caption_florence")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/dragon/Test_Projects/OmniParser/util/utils.py", line 65, in get_caption_model_processor
model = AutoModelForCausalLM.from_pretrained(model_name_or_path, torch_dtype=torch.float32, trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 487, in from_pretrained
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "/home/dragon/.pyenv/versions/omniparserv2/lib/python3.12/site-packages/transformers/utils/hub.py", line 365, in cached_file
raise EnvironmentError(
OSError: weights/icon_caption_florence is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
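For context, the 404 URL in the first traceback shows how the path string was reinterpreted: since `weights/icon_caption_florence` was not found as a local directory, `transformers` treated it as a Hub `repo_id` (org `weights`, repo `icon_caption_florence`). A minimal sketch of the resolve URL the hub client requests (the helper name is mine, not a `huggingface_hub` API):

```python
def hub_resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    # Illustrative helper: the "resolve" URL the Hub client fetches
    # for a given repo_id, revision, and file.
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

# The relative path, read as a repo_id, yields exactly the failing URL:
print(hub_resolve_url("weights/icon_caption_florence", "config.json"))
# → https://huggingface.co/weights/icon_caption_florence/resolve/main/config.json
```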
I have tried running the huggingface-cli login
command and passing the token
to it, but the problem remains the same.