Add a new `cache_result` fixture parameter (default True) that, when set to False, ignores the fixture cache and re-executes the fixture on each usage
What's the problem this feature will solve?
Reduce code duplication. Allow leveraging the setup/teardown features of pytest fixtures while making fixtures a bit more reusable, like functions.
Describe the solution you'd like
An ability to declare that a fixture should not use its cached result, and instead run and recalculate on every reference.
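For context, this is the behavior the parameter would relax: within a single test, every reference to a fixture, whether direct or through another fixture, currently resolves to the same cached value. A minimal self-contained sketch of today's behavior:

import pytest

@pytest.fixture
def project():
    # stand-in for an object created in the DB
    return object()

@pytest.fixture
def project_config(project):
    return {"project": project}

def test_cached(project, project_config):
    # both references resolve to the single cached instance for this test
    assert project_config["project"] is project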
For example, say I am testing my backend server, which manages a database with two object definitions: Projects and ProjectConfigurations. When testing basic get functionality, I would declare a fixture which creates a project, and another which creates a project configuration:
@pytest.fixture
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )
    yield project
    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config(
    project: Project,
    client: AsyncClient,
    engine,
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)
    yield project_config
    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1
        await connection.commit()
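A get test then only needs to request the relevant fixture; a rough sketch, with a hypothetical endpoint path:

async def test_get_project_config(client: AsyncClient, project_config: ProjectConfig):
    # the fixture guarantees the config exists before the call and is removed afterwards
    response = await client.get(f"/project-configs/{project_config.id}")
    assert response.status_code == 200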
This is already a lot of code, but fair. These are functionalities we must declare.
Now, when moving on to testing basic list functionality, I want more than one object of each type in the DB during the test. I have to duplicate the fixtures above, or create a brand-new fixture that creates X objects in my DB as setup (sketched further below). For example:
@pytest.fixture
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )
    yield project
    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config(
    project: Project,
    client: AsyncClient,
    engine,
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)
    yield project_config
    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1
        await connection.commit()
@pytest.fixture
async def project2(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )
    yield project
    await delete_project(project_id=project.id)

@pytest.fixture
async def project_config2(
    project: Project,
    client: AsyncClient,
    engine,
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)
    yield project_config
    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1
        await connection.commit()
Now my list test will use project_config2 and project2.
Here the code starts getting cluttered and confusing. As tests pile up and my fixtures grow, I have to be careful that the fixtures I use are not coupled to each other, as that could lead to unexpected behavior. The result is a very large test code base that is hard to manage.
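The other option mentioned earlier, a single setup fixture that creates N objects up front, avoids the duplication; a rough sketch, reusing the same helpers as the fixtures above:

@pytest.fixture
async def projects(
    client: AsyncClient,
) -> AsyncGenerator[list[Project], typing.Any]:
    # create a fixed number of projects up front, remove them all on teardown
    projects = [
        await tools.create(
            client=client,
            project_create=models_factory.ProjectCreateFactory.build(),
        )
        for _ in range(2)
    ]
    yield projects
    for project in projects:
        await delete_project(project_id=project.id)

This works, but it hard-codes the object count and no longer composes with the per-object fixtures that the get tests already use.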
In my opinion, there could be a better way!
At the end of the day, I want to create objects in the DB. I care that they are there, and I care that they are removed when my test finishes so that other tests are not affected. In my case, I do not care whether every reference resolves to the same object, as I usually have a linear usage for each fixture.
If I could declare that a fixture should not use its cache, my issue would be solved: I could reference a fixture as many times as I want without duplicating code, still get new objects, and still enjoy the benefits of pytest's reliable setup/teardown flows. I would not need to change my existing fixture infrastructure when testing new components of the same objects in my backend.
It would look something like this:
@pytest.fixture(cache_result=False)
async def project(
    client: AsyncClient,
) -> AsyncGenerator[Project, typing.Any]:
    project = await tools.create(
        client=client,
        project_create=models_factory.ProjectCreateFactory.build(),
    )
    yield project
    await delete_project(project_id=project.id)

@pytest.fixture(cache_result=False)
async def project_config(
    project: Project,
    client: AsyncClient,
    engine,
) -> AsyncGenerator[ProjectConfig, typing.Any]:
    project_config = await tools.create(client=client, project_id=project.id)
    yield project_config
    async with engine.connect() as connection:
        res = await connection.execute(
            delete(ProjectConfig).where(
                project_config.id == ProjectConfig.id
            )
        )
        assert res.rowcount == 1
        await connection.commit()
@pytest.fixture
def project_config2(project_config) -> ProjectConfig:
    return project_config

def test1(project_config, project):
    ...

def test_list(project_config, project_config2):
    assert project_config != project_config2
    assert len(list_configs()) == 2
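Because project_config no longer caches its result, the project_config2 alias simply re-runs it, so the list test sees two distinct configs while each still gets its own teardown.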
There are of course many more capabilities that come with this feature; this is just one useful usage.