
Unset project as cleanup in empty_data_context

Open · tyler-hoffman opened this pull request 9 months ago · 1 comment

The value from set_project persists between tests, which means that many of our tests only pass because some earlier test had already called set_project (likely indirectly via a fixture).
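A minimal sketch of the intended cleanup, assuming the module-level project_manager in context_factory exposes set_project and that passing None clears the active project (the actual PR may use a dedicated reset helper); the fixture body and argument names here are illustrative, not the PR's exact code:

```python
import pytest
import great_expectations as gx
from great_expectations.data_context.data_context import context_factory


@pytest.fixture
def empty_data_context(tmp_path):
    # Setup as before: building a context also registers the project with the
    # module-level ProjectManager, which is the state that leaks between tests.
    context = gx.get_context(project_root_dir=tmp_path)
    yield context
    # Teardown added by this PR (sketched): clear the registered project so a
    # later test that forgot to set up its own context fails loudly with
    # DataContextRequiredError instead of silently reusing this one.
    context_factory.project_manager.set_project(None)  # hypothetical reset call
```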

  • [ ] Description of PR changes above includes a link to an existing GitHub issue
  • [ ] PR title is prefixed with one of: [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], [CONTRIB]
  • [ ] Code is linted - run invoke lint (uses ruff format + ruff check)
  • [ ] Appropriate tests and docs have been updated

For more information about contributing, see Contribute.

After you submit your PR, keep the page open and monitor the statuses of the various checks made by our continuous integration process at the bottom of the page. Please fix any issues that come up and reach out on Slack if you need help. Thanks for contributing!

tyler-hoffman · May 09 '24 15:05

Deploy Preview for niobium-lead-7998 canceled.

Latest commit: f69952517bf12d16f6265d80c8973f4467407e95
Latest deploy log: https://app.netlify.com/sites/niobium-lead-7998/deploys/66cf2161b5e4c500086a6a2e

netlify[bot] · May 09 '24 15:05

❌ 12 Tests Failed:

Tests completed: 4638 | Failed: 12 | Passed: 4626 | Skipped: 280
Top 3 failed tests, by shortest run time:
tests.render.test_inline_renderer test_inline_renderer_instantiation_error_message
Stack Traces | 0.004s run time
No failure message available
tests.validator.test_validator test_show_progress_bars_property_and_setter
Stack Traces | 0.005s run time
No failure message available
tests.core.test_expectation_suite.TestInit test_bad_expectation_configs_are_skipped
Stack Traces | 0.006s run time
self = <tests.core.test_expectation_suite.TestInit object at 0x7f0352cb1ea0>
bad_expectation_dict = {'kwargs': {}, 'meta': {'notes': 'this_should_explode'}, 'type': 'expect_stuff_not_to_go_well'}
expect_column_values_to_be_in_set_col_a_with_meta_dict = {'kwargs': {'column': 'a', 'value_set': [1, 2, 3, 4, 5]}, 'meta': {'notes': 'This is an expectation.'}, 'type': 'expect_column_values_to_be_in_set'}

    @pytest.mark.unit
    def test_bad_expectation_configs_are_skipped(
        self,
        bad_expectation_dict: dict,
        expect_column_values_to_be_in_set_col_a_with_meta_dict: dict,
    ):
>       suite = ExpectationSuite(
            name="test_suite",
            expectations=[
                bad_expectation_dict,
                expect_column_values_to_be_in_set_col_a_with_meta_dict,
            ],
        )

tests/core/test_expectation_suite.py:188: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
great_expectations/core/expectation_suite.py:109: in __init__
    self._store = project_manager.get_expectations_store()
.../data_context/data_context/context_factory.py:95: in get_expectations_store
    return self._project.expectations_store
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <great_expectations.data_context.data_context.context_factory.ProjectManager object at 0x7f0366185d80>

    @property
    def _project(self) -> AbstractDataContext:
        if not self.__project:
>           raise DataContextRequiredError()
E           great_expectations.exceptions.exceptions.DataContextRequiredError: This action requires an active data context. Please call `great_expectations.get_context()` first, then try your action again.

.../data_context/data_context/context_factory.py:91: DataContextRequiredError
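The DataContextRequiredError above is exactly the failure mode the PR description predicts: these tests construct an ExpectationSuite directly and previously passed only because an earlier test had left a project set. A minimal sketch of the dependency such a test (or a fixture it uses) now has to state explicitly, assuming the public gx.get_context() entry point:

```python
import great_expectations as gx
from great_expectations.core.expectation_suite import ExpectationSuite

# Registering a context first populates the ProjectManager, so the suite can
# resolve its expectations store instead of raising DataContextRequiredError.
context = gx.get_context()
suite = ExpectationSuite(name="test_suite")
```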

To compare individual test run times against the main branch, go to the Test Analytics Dashboard.

codecov[bot] · Aug 28 '24 13:08