
stabilize Transformer testing

Borda opened this issue 3 years ago • 2 comments

🚀 Feature

Find a way to stabilize the tests that use Transformers. So far they are quite flaky, failing on random connection issues, which makes all of our testing nondeterministic :(

Motivation

Make the tests with Transformers stable, or replace their models in testing with lightweight stand-ins (see the sketch below).
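
For illustration, a minimal sketch of what the replacement idea could look like: pointing the tests at a tiny randomly initialized checkpoint, such as the ones published under the hf-internal-testing namespace on the Hub. This is an assumption about a possible swap, not the project's current setup:

from transformers import AutoModel

# Assumption: a tiny random-weight checkpoint stands in for the full model,
# so the download is a few hundred kilobytes instead of hundreds of megabytes.
model = AutoModel.from_pretrained("hf-internal-testing/tiny-random-bert")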

Alternatives

we already have in the configuration

TRANSFORMERS_CACHE=.cache/huggingface/

but it is only a secondary option, and it still asks the Hub for a fresh model whenever possible... on the other hand, using

TRANSFORMERS_OFFLINE=1

fails, as it seems that TRANSFORMERS_CACHE is not properly utilized
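
For context, a minimal sketch of the warm-then-offline idea: run a script like this once while online so that a later run with TRANSFORMERS_OFFLINE=1 can resolve everything from the cache. The model name is a placeholder; in practice the loop would enumerate every checkpoint the tests touch:

import os

# Point the cache at the CI-persisted directory *before* importing transformers,
# since the library reads this variable at import time.
os.environ["TRANSFORMERS_CACHE"] = ".cache/huggingface/"

from transformers import AutoModel, AutoTokenizer

for name in ("bert-base-uncased",):  # placeholder checkpoint list
    # Downloading once here populates TRANSFORMERS_CACHE for later offline runs.
    AutoTokenizer.from_pretrained(name)
    AutoModel.from_pretrained(name)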

Additional context

part of #678 and partially addressed in #802 cc: @nateraw @stancld

Borda avatar Apr 07 '22 02:04 Borda

Hi @Borda, I encountered the same issue on another project yesterday. At the moment I'm not aware of any proper solution, but as a short-term fix I currently use a wrapper like this:

from functools import wraps
from typing import Any, Callable, Optional

import pytest


def skip_on_connection_issues(
    exception=OSError, reason: str = "Unable to load checkpoints from HuggingFace `transformers`."
):
    """Decorator that skips a test instead of failing it when `exception` is raised."""

    def test_decorator(function: Callable) -> Callable:
        @wraps(function)
        def run_test(*args: Any, **kwargs: Any) -> Optional[Any]:
            try:
                return function(*args, **kwargs)
            except exception:
                # Typically an OSError raised by `from_pretrained` on connection problems.
                pytest.skip(reason)

        return run_test

    return test_decorator

i.e., tests run normally if an HF checkpoint is successfully retrieved, and they're marked as skipped if the download fails.
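
For example, applied to a test (the test name and checkpoint are placeholders, just to show the intended usage):

from transformers import AutoModel  # assumes `transformers` is installed


@skip_on_connection_issues()
def test_bert_score_against_reference():  # hypothetical test name
    # Any OSError raised while fetching the checkpoint skips the test instead of failing it.
    model = AutoModel.from_pretrained("bert-base-uncased")  # placeholder checkpoint
    assert model is not None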

edit: Maybe pytest.mark.xfail can handle this too.
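
A rough sketch of that alternative, assuming the connection problem surfaces as an OSError:

import pytest


# Expected-failure variant: the test is reported as xfailed (not failed) when an
# OSError is raised while fetching the checkpoint, and runs normally otherwise.
@pytest.mark.xfail(raises=OSError, reason="Unable to load checkpoints from HuggingFace `transformers`.")
def test_bert_score_against_reference():  # hypothetical test name
    ...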

stancld avatar Apr 07 '22 06:04 stancld

that could be an interesting hotfix; mind sending a PR with this wrapper? and let's explicitly set the exception types :)

Borda avatar Apr 07 '22 06:04 Borda