pytest-django
Where can we add SQL for schema creation in Postgres?
I've got some database tables that exist in a separate Postgres schema called legacy. When using pytest-django to set up a test database, I get this error when it tries to apply the migrations:
django.db.utils.ProgrammingError: schema "legacy" does not exist
That makes good sense as it's just created a fresh test database and that schema, in fact, does not exist.
However, I'm really struggling to see where I can hook into pytest-django's ecosystem of fixtures in order to create the schema I need before the migrations are applied.
I've read some other related issues, but haven't come across a definitive answer.
I tried adding an autouse fixture in conftest.py, but I think it would run too late in the process, and so my test run ends with the error above.
import pytest
from django.db import connection


@pytest.fixture(scope='session', autouse=True)
def legacy_schema(django_db_setup, django_db_blocker):
    # https://docs.djangoproject.com/en/3.1/topics/db/sql/#executing-custom-sql-directly
    with django_db_blocker.unblock():
        with connection.cursor() as cursor:
            cursor.execute('CREATE SCHEMA legacy;')
Can anyone point me in the right direction? Or if this is genuinely impossible with pytest-django, let me know so I can pick a different test runner.
I've also tried following the example suggested in the docs to override django_db_setup in conftest.py, but that doesn't seem to work for me, either.
import pytest
from django.db import connection


@pytest.fixture(scope='session')
def django_db_setup(django_db_setup, django_db_blocker):
    with django_db_blocker.unblock():
        with connection.cursor() as cursor:
            cursor.execute('CREATE SCHEMA legacy;')
I'm not convinced this is an elegant or maintainable solution, but for the benefit of other readers who might have this issue, I was able to get this working by overriding django_db_setup and doing the DB creation and teardown there directly.
My conftest.py looks like this:
from functools import partial

import psycopg2
import pytest
from django.core.management import call_command
from django.db import connection, connections
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

TEST_DB_NAME = 'test_db'


# https://pytest-django.readthedocs.io/en/latest/database.html#examples
def _run_sql(settings, sql):
    conn = psycopg2.connect(
        dbname='postgres',
        host=settings['HOST'],
        port=settings['PORT'],
        user=settings['USER'],
        password=settings['PASSWORD'],
    )
    conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()
    cur.execute(sql)
    conn.close()


@pytest.fixture(scope='session')
def django_db_setup(django_db_blocker):
    from django.conf import settings

    settings.DATABASES['default']['NAME'] = TEST_DB_NAME
    run_sql = partial(_run_sql, settings.DATABASES['default'])

    run_sql(f'DROP DATABASE IF EXISTS {TEST_DB_NAME}')
    run_sql(f'CREATE DATABASE {TEST_DB_NAME}')

    with django_db_blocker.unblock():
        with connection.cursor() as cursor:
            cursor.execute('CREATE SCHEMA IF NOT EXISTS legacy;')
        call_command('migrate')

    yield

    for conn in connections.all():
        conn.close()
    run_sql(f'DROP DATABASE {TEST_DB_NAME}')
This bypasses a lot of the machinery in pytest-django and Django's own django.test.utils.setup_databases(), so I'm not sure it's really a good idea.
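For the record, with that in place, tests opt into database access via the usual pytest-django marker. A minimal sketch (the test and model names are hypothetical):

import pytest

# Hypothetical model living in the legacy schema.
from legacy_app.models import LegacyRecord


@pytest.mark.django_db
def test_legacy_schema_is_queryable():
    # The legacy schema now exists in the test DB, so ORM queries against it succeed.
    assert LegacyRecord.objects.count() == 0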
Are you reusing the test DB (keepdb) or creating it anew every time?
Right now, we're tearing it down after every test run and re-creating fresh.
What you want to do is run a step between when the DB is created and when the migrations run. The trouble is that pytest-django (or really Django itself) combines these steps, which makes it impossible to hook in between.
Have you considered creating the schema itself in a migration? That should be robust and will fix such issues.
Have you considered creating the schema itself in a migration?
That's a pretty good suggestion! The only problem is I'd want that to run as the 0th migration, and it seems like a pain to retrofit it into an existing series of migrations (especially since there are "live" databases managed by these migrations). Maybe if I were starting a new project fresh.
Assuming that your production DB already has the schema created, you can just "cheat" and retro-edit it into your initial migration. That shouldn't actually cause any issues.
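For anyone going this route, a rough sketch of what the retro-edited initial migration could look like. The RunSQL operation would be prepended to your app's existing initial migration; the schema name matches this thread, everything else is a placeholder:

from django.db import migrations


class Migration(migrations.Migration):

    initial = True
    dependencies = []  # hypothetical; keep your migration's real dependencies

    operations = [
        # Create the schema before any CreateModel operations that need it;
        # reverse_sql keeps the migration reversible.
        migrations.RunSQL(
            sql='CREATE SCHEMA IF NOT EXISTS legacy;',
            reverse_sql='DROP SCHEMA IF EXISTS legacy;',
        ),
        # ...followed by the original CreateModel operations.
    ]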
@bjmc @bluetech Have you found any solution, ideally a nice one? My data/models are in another schema (non-public). I'm using Docker, so I hook a create-schema script into /docker-entrypoint-initdb.d/ and Docker creates the schema while initializing Postgres.
But when I run tests, Django creates a new test_db which obviously won't have the schema, and all the tests fail. Anything to suggest?
I never found a nice solution. Unless things have changed in the last couple of years, Django's support for non-public schemas should be considered experimental at best. My advice would be to try to keep everything in one public schema. You can achieve some measure of separation using Django "applications".
Thanks for the suggestions.
Creating the schema in a pre_migrate receiver is working for us:
from typing import Any

from django.apps import AppConfig
from django.db import connection
from django.db.models.signals import pre_migrate
from django.dispatch import receiver


@receiver(pre_migrate)
def create_schema(sender: AppConfig, **kwargs: Any) -> None:
    with connection.cursor() as cursor:
        cursor.execute("CREATE SCHEMA IF NOT EXISTS <schema>;")