Cannot download Phoenix 2014 T dataset

Open Jai2500 opened this issue 1 year ago • 3 comments

Hello,

I'm unable to download the Phoenix 2014 T dataset with the MediaPipe poses. I get a 403 Forbidden error when I try. Any help would be greatly appreciated.

Thanks, Jai

Jai2500 avatar Jan 23 '25 21:01 Jai2500

What exactly are you running, and what exactly is the error? If I go to the URL referenced in https://github.com/sign-language-processing/datasets/blob/master/sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t.py#L39C27-L39C118, it works just fine.

AmitMY avatar Jan 24 '25 08:01 AmitMY

The issue arises when I try to download the holistic poses. The traceback is below:

data_l = tfds.load(name=config.dataset, builder_kwargs=dict(config=sd_config), data_dir=config.data_dir)[self.split]
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/logging/__init__.py", line 169, in __call__
    return function(*args, **kwargs)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/load.py", line 647, in load
    _download_and_prepare_builder(dbuilder, download, download_and_prepare_kwargs)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/load.py", line 506, in _download_and_prepare_builder
    dbuilder.download_and_prepare(**download_and_prepare_kwargs)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/logging/__init__.py", line 169, in __call__
    return function(*args, **kwargs)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/dataset_builder.py", line 699, in download_and_prepare
    self._download_and_prepare(
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/dataset_builder.py", line 1669, in _download_and_prepare
    split_infos = self._generate_splits(dl_manager, download_config)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/dataset_builder.py", line 1620, in _generate_splits
    split_generators = self._split_generators(  # pylint: disable=unexpected-keyword-arg
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/sign_language_datasets/datasets/rwth_phoenix2014_t/rwth_phoenix2014_t.py", line 92, in _split_generators
    downloads = dl_manager.download_and_extract(urls)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/download_manager.py", line 694, in download_and_extract
    return _map_promise(self._download_extract, url_or_urls)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/download_manager.py", line 791, in _map_promise
    res = tree.map_structure(lambda p: p.get(), all_promises)  # Wait promises
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tree/__init__.py", line 435, in map_structure
    [func(*args) for args in zip(*map(flatten, structures))])
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tree/__init__.py", line 435, in <listcomp>
    [func(*args) for args in zip(*map(flatten, structures))])
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/download_manager.py", line 791, in <lambda>
    res = tree.map_structure(lambda p: p.get(), all_promises)  # Wait promises
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/promise/promise.py", line 512, in get
    return self._target_settled_value(_raise=True)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/promise/promise.py", line 516, in _target_settled_value
    return self._target()._settled_value(_raise)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/promise/promise.py", line 226, in _settled_value
    reraise(type(raise_val), raise_val, self._traceback)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/six.py", line 719, in reraise
    raise value
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/promise/promise.py", line 844, in handle_future_result
    resolve(future.result())
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/downloader.py", line 296, in _sync_download
    with _open_url(url, verify=verify) as (response, iter_content):
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/downloader.py", line 369, in _open_with_requests
    _assert_status(response)
  File "/home2/jai.bardhan/micromamba/envs/llava/lib/python3.10/site-packages/tensorflow_datasets/core/download/downloader.py", line 396, in _assert_status
    raise download_utils_lib.DownloadError(
tensorflow_datasets.core.download.util.DownloadError: Failed to get url https://firebasestorage.googleapis.com/v0/b/sign-language-datasets/o/public%2Fphoenix-annotations.tar.gz?alt=media. HTTP code: 403.

When I open the Firebase URL in my browser, it also gives a 403 Forbidden error.
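A quick way to confirm the failure is the storage bucket itself rather than the TFDS pipeline is to hit the URL from the traceback with a plain HTTP request. This is a minimal sketch; the `describe_status` helper and its wording are illustrative, not part of the library:

```python
# Hedged sketch: reproduce the 403 outside of TFDS with a bare HTTP HEAD request.
import urllib.error
import urllib.request

OLD_URL = (
    "https://firebasestorage.googleapis.com/v0/b/sign-language-datasets/o/"
    "public%2Fphoenix-annotations.tar.gz?alt=media"
)

def describe_status(code: int) -> str:
    """Map an HTTP status code to a short diagnosis for this issue."""
    if code == 403:
        return "403: the bucket refuses access -- the file was likely moved"
    if code == 200:
        return "200: the URL is reachable; the problem is elsewhere"
    return f"{code}: unexpected status"

try:
    request = urllib.request.Request(OLD_URL, method="HEAD")
    with urllib.request.urlopen(request) as resp:
        print(describe_status(resp.status))
except urllib.error.HTTPError as err:
    # A 403 from Firebase arrives here as an HTTPError, not a response object.
    print(describe_status(err.code))
except urllib.error.URLError as err:
    print(f"network error: {err.reason}")
```

If this prints the 403 diagnosis, the old download URL is dead and no amount of retrying inside TFDS will help.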

Jai2500 avatar Jan 24 '25 08:01 Jai2500

Please update to the latest version of the dataset package. We moved to different storage: https://github.com/sign-language-processing/datasets/commit/9a9f2d94371bda64a0e4bcda21b8fab7d1805736
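One way to check whether a local install predates the storage migration is to compare the installed version against the first release containing that commit. A minimal sketch, with the caveat that the `0.2.0` threshold below is an assumption; check the repository's release history for the actual first release that includes the fix:

```python
# Hedged sketch: warn if the installed sign-language-datasets release is too old.
# The "fixed_in" version is an assumption, not taken from the changelog.
from importlib import metadata

def parse_version(v: str) -> tuple:
    """Turn a version string like '0.1.5' into (0, 1, 5) for numeric comparison."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def needs_upgrade(installed: str, fixed_in: str = "0.2.0") -> bool:
    """True if the installed version predates the release with the new storage URL."""
    return parse_version(installed) < parse_version(fixed_in)

try:
    installed = metadata.version("sign-language-datasets")
    if needs_upgrade(installed):
        print(f"{installed} predates the storage move; "
              "run `pip install --upgrade sign-language-datasets`")
    else:
        print(f"{installed} should already use the new storage location")
except metadata.PackageNotFoundError:
    print("sign-language-datasets is not installed")
```

After upgrading, it may also be necessary to clear any partially downloaded files from the TFDS download cache (by default under `~/tensorflow_datasets/downloads/`) so the builder retries with the new URL.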

AmitMY avatar Jan 24 '25 16:01 AmitMY