full-stack-fastapi-template
Separating database for tests and dev
Each time the tests are run, a few random users are created. I can see that being a problem when I want to test models: the database becomes cluttered really quickly, even if just in dev.
To have two databases, I tried splitting env-postgres.env into two files: env-postgres.env and env-postgres-test.env.
Then, in docker-compose.test.yml, I changed the env_file entry to point at the new env file.
After that, I ran docker-compose up -d.
When I try to run the tests, I get an error saying the database does not exist. If I create it manually it works, but more errors show up because the migrations didn't run automatically. How is the database for the backend created automatically, and how can I do the same for the tests database?
Thanks
It looks like some of the tests access the API from the other container and do black-box testing. I am not sure of a simple way to change the database for these.
Some of the other tests (CRUD tests etc.), however, can be changed fairly easily to use a separate DB.
What I did to achieve this was to add the following two fixtures in tests/conftest.py:
```python
import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

from app.core import config  # missing from the original snippet
from app.db import init_db, session
from app.db.base_class import Base

engine = None


@pytest.fixture(autouse=True, scope="session")
def database_session_fixture():
    global engine
    # Create the test database via the existing (dev) engine; the
    # explicit "commit" closes the implicit transaction so that
    # CREATE DATABASE is allowed.
    conn = session.engine.connect()
    conn.execute("commit")
    conn.execute("create database app_automated_tests")

    sqlalchemy_database_uri = (
        f"postgresql://{config.POSTGRES_USER}:{config.POSTGRES_PASSWORD}"
        f"@{config.POSTGRES_SERVER}/app_automated_tests"
    )
    engine = create_engine(sqlalchemy_database_uri, pool_pre_ping=True)

    # Repoint the app's global session at the test database
    session.db_session = scoped_session(
        sessionmaker(autocommit=False, autoflush=False, bind=engine)
    )
    session.Session = session.db_session

    yield

    # Teardown: dispose the test engine and drop the test database
    engine.dispose()
    conn.execute("commit")
    conn.execute("drop database if exists app_automated_tests")


@pytest.fixture(scope="function")
def db_session():
    # Fresh schema plus seed data for every test
    Base.metadata.create_all(engine)
    sess = session.Session()
    init_db.init_db(sess)
    yield sess
    sess.close()
    Base.metadata.drop_all(bind=engine)
```
I then updated the CRUD tests. For example, in test_item I removed the following import:

```python
from app.db.session import db_session
```

and updated each of the tests to take a db_session parameter. I also had to update create_random_user to take a db_session and use that instead of the global.
tests/crud/test_item.py
```python
import pytest

from app import crud
from app.schemas.item import ItemCreate, ItemUpdate
from app.tests.utils.user import create_random_user
from app.tests.utils.utils import random_lower_string


def test_create_item(db_session):
    title = random_lower_string()
    description = random_lower_string()
    item_in = ItemCreate(title=title, description=description)
    user = create_random_user(db_session)
    item = crud.item.create_with_owner(
        db_session=db_session, obj_in=item_in, owner_id=user.id
    )
    assert item.title == title
    assert item.description == description
    assert item.owner_id == user.id


def test_get_item(db_session):
    title = random_lower_string()
    description = random_lower_string()
    item_in = ItemCreate(title=title, description=description)
    user = create_random_user(db_session)
    item = crud.item.create_with_owner(
        db_session=db_session, obj_in=item_in, owner_id=user.id
    )
    stored_item = crud.item.get(db_session=db_session, id=item.id)
    assert item.id == stored_item.id
    assert item.title == stored_item.title
    assert item.description == stored_item.description
    assert item.owner_id == stored_item.owner_id


def test_update_item(db_session):
    title = random_lower_string()
    description = random_lower_string()
    item_in = ItemCreate(title=title, description=description)
    user = create_random_user(db_session)
    item = crud.item.create_with_owner(
        db_session=db_session, obj_in=item_in, owner_id=user.id
    )
    description2 = random_lower_string()
    item_update = ItemUpdate(description=description2)
    item2 = crud.item.update(db_session=db_session, db_obj=item, obj_in=item_update)
    assert item.id == item2.id
    assert item.title == item2.title
    assert item2.description == description2
    assert item.owner_id == item2.owner_id


def test_delete_item(db_session):
    title = random_lower_string()
    description = random_lower_string()
    item_in = ItemCreate(title=title, description=description)
    user = create_random_user(db_session)
    item = crud.item.create_with_owner(
        db_session=db_session, obj_in=item_in, owner_id=user.id
    )
    item2 = crud.item.remove(db_session=db_session, id=item.id)
    item3 = crud.item.get(db_session=db_session, id=item.id)
    assert item3 is None
    assert item2.id == item.id
    assert item2.title == title
    assert item2.description == description
    assert item2.owner_id == user.id
```
and here is the updated create_random_user from tests/utils/user.py:
```python
def create_random_user(db_session):
    email = random_lower_string()
    password = random_lower_string()
    user_in = UserCreate(username=email, email=email, password=password)
    user = crud.user.create(db_session=db_session, obj_in=user_in)
    return user
```
You will have to update all the places where db_session is used so that they use the fixture instead.
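The mechanical change is the same everywhere: delete the module-level db_session import and thread the session through as a parameter, so the fixture can supply a session bound to the test database. A stdlib-only sketch of the pattern (the names here are illustrative, not the template's):

```python
import random
import string


def random_lower_string() -> str:
    # same idea as the template's helper: random lowercase letters
    return "".join(random.choices(string.ascii_lowercase, k=32))


# Before: a helper like this imported a module-level db_session (hidden
# global state). After: the session is an explicit first parameter that
# matches the fixture name, so pytest-injected sessions flow straight through.
def create_random_payload(db_session, owner_id: int) -> dict:
    # db_session would be passed on to crud calls here; this sketch only
    # demonstrates the signature change.
    return {"title": random_lower_string(), "owner_id": owner_id}
```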
I am also planning on using the FastAPI TestClient for testing the API instead of accessing it from the other Docker container. By doing this, it should use the test database for those tests as well.
Hope that helps :)
For the blackbox tests, you could probably use https://github.com/tiangolo/full-stack-fastapi-postgresql/issues/65 to run uvicorn in the test container and pass a reference to that as a fixture.
@drone-ah thanks for your help. I stopped using this repo after trying to get it running for weeks, and I have not even tried the Swarm production setup yet. I will certainly come back to it as it gains a bit more maturity; until then I will stay with Laravel, which I can set up on Heroku in a day and get coding. This template certainly has great potential, but as a newcomer to Python, Docker, SQLAlchemy, etc., I am not quite ready to solve all the issues needed to get it running out of the box. I made a PR to improve the setup instructions for Windows users, but more is needed. For instance, why is Jupyter included, and how do you use it? The learning curve for this repo is too high for me.
I'll definitely come back for another project, and I'll be happy to take the time to improve the tutorial as I discover this framework from a newcomer perspective.
@robido, I agree that there is a bit of a steep learning curve in adoption, mainly because there are a lot of things being set up. I should point out that I only started using this a few days back myself - but had the same requirement as you - so wanted to pitch in ;)
I spent a lot of time in PHP land myself (mainly Drupal) and Python takes a little bit of getting used to for sure, but I haven't looked back.
I managed to get the uvicorn part working as well, to complete the separation of the tests.
I set up a test server entrypoint:
tests/test_server.py
```python
from fastapi import FastAPI
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

from app import main
from core import session, config

sqlalchemy_database_uri = (
    f"postgresql://{config.POSTGRES_USER}:{config.POSTGRES_PASSWORD}"
    f"@{config.POSTGRES_SERVER}/app_automated_tests"
)
engine = create_engine(sqlalchemy_database_uri, pool_pre_ping=True)
db_session = scoped_session(
    sessionmaker(autocommit=False, autoflush=False, bind=engine)
)
Session = db_session

app = main.app
```
I then set up the fixture in tests/conftest.py:

```python
import time
from multiprocessing import Process

import pytest
import uvicorn


def run_server():
    uvicorn.run("tests.test_server:app", port=8123)


@pytest.fixture(scope="module")
def server_api():
    proc = Process(target=run_server, args=(), daemon=True)
    proc.start()
    time.sleep(0.5)  # give the server a moment to start
    yield "http://localhost:8123"
    proc.kill()  # clean up after the tests
```
I then used the fixture as mentioned in https://github.com/tiangolo/full-stack-fastapi-postgresql/issues/65
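The lifecycle of that fixture (spawn a server process, wait briefly, yield a base URL, kill the process on teardown) can be demonstrated with the standard library alone; this sketch substitutes http.server for uvicorn purely to show the pattern:

```python
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from multiprocessing import Process


class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep test output quiet


def run_server(port):
    HTTPServer(("127.0.0.1", port), PingHandler).serve_forever()


def start_test_server(port=8123):
    # mirrors the server_api fixture: start, crude sleep for startup,
    # hand back the process and the base URL (the fixture yields the URL)
    proc = Process(target=run_server, args=(port,), daemon=True)
    proc.start()
    time.sleep(0.5)
    return proc, f"http://127.0.0.1:{port}"


if __name__ == "__main__":
    proc, base_url = start_test_server()
    print(urllib.request.urlopen(base_url).read())
    proc.kill()
```

A real fixture would poll the port instead of sleeping a fixed 0.5 s, but the shape is the same.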
It turns out this was necessary for all the tests to pass: the tests use information from the database to build the API calls, and fail if the two don't match.
With this in place, the testing is entirely self-contained.
If it's useful, I could make the updates to the repo and make a pull request.
@drone-ah that is great thanks! I'll try it out when I am ready to try again with this framework. Cheers.
I would also like to add, after reading the documentation, that it is possible to override your dependencies. See Testing a Database and Settings and testing. In this instance you can override get_db and swap in a testing database.
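For clarity about what an override does: FastAPI keeps a dict on the app that maps original dependency callables to replacements, and consults it at request time. This toy stand-in (not FastAPI's actual code, just the shape of the mechanism) shows why registering override_get_db is enough:

```python
def get_db():
    return "dev-db-session"


def override_get_db():
    return "test-db-session"


class ToyApp:
    """Bare-bones imitation of FastAPI's dependency_overrides lookup."""

    def __init__(self):
        self.dependency_overrides = {}

    def resolve(self, dependency):
        # use the registered override if there is one, else the original
        return self.dependency_overrides.get(dependency, dependency)()


app = ToyApp()
app.dependency_overrides[get_db] = override_get_db
```

Because the lookup happens per request, clearing the dict restores the original dependency without restarting anything.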
Here is the solution I ended up going with.
tests/utils/test_db.py
```python
from pydantic import PostgresDsn
from sqlalchemy.engine import create_engine
from sqlalchemy.orm.session import sessionmaker

from app.core.config import settings

SQLALCHEMY_DATABASE_URL = PostgresDsn.build(
    scheme="postgresql",
    user=settings.POSTGRES_USER,
    password=settings.POSTGRES_PASSWORD,
    host=settings.POSTGRES_SERVER,
    path=f"/{settings.POSTGRES_DB}_test",
)

engine = create_engine(SQLALCHEMY_DATABASE_URL, pool_pre_ping=True)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
```
tests/utils/overrides.py
```python
from .test_db import TestingSessionLocal


def override_get_db():
    # create the session outside the try block so the finally clause
    # never sees an unbound name if the factory itself raises
    db = TestingSessionLocal()
    try:
        yield db
    finally:
        db.close()
```
tests/conftest.py
```python
from typing import Generator

import pytest
from fastapi.testclient import TestClient

from app.api.deps import get_db
from app.db.base import Base
from app.main import app
from .utils.overrides import override_get_db
from .utils.test_db import TestingSessionLocal, engine

app.dependency_overrides[get_db] = override_get_db


@pytest.fixture(scope="session")
def db() -> Generator:
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)
    yield TestingSessionLocal()


@pytest.fixture(scope="module")
def client() -> Generator:
    with TestClient(app) as c:
        yield c
```
@alexvanzyl Nice!
I also had to create the test DB in scripts/test.sh:

```sh
docker-compose -f docker-stack.yml exec db sh -c 'psql -U ${POSTGRES_USER} -c "DROP DATABASE IF EXISTS ${POSTGRES_DB}_test;" && psql -U ${POSTGRES_USER} -c "CREATE DATABASE ${POSTGRES_DB}_test;"'
docker-compose -f docker-stack.yml exec -T backend bash /app/tests-start.sh "$@"
docker-compose -f docker-stack.yml exec db sh -c 'psql -U ${POSTGRES_USER} -c "DROP DATABASE IF EXISTS ${POSTGRES_DB}_test;"'
```
Actually it can be a lot easier.
We can override the POSTGRES_DB variable:
in docker-compose.yml:

```yaml
# Allow explicit env var override for tests
- SMTP_HOST=${SMTP_HOST}
- POSTGRES_DB=${POSTGRES_DB}
```
in scripts/test.sh:

```sh
DOMAIN=backend \
SMTP_HOST="" \
TRAEFIK_PUBLIC_NETWORK_IS_EXTERNAL=false \
INSTALL_DEV=true \
POSTGRES_DB="app_test" \
docker-compose \
  -f docker-compose.yml \
  config > docker-stack.yml
```
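The reason this works: the backend builds its database URI from the POSTGRES_DB environment variable, so exporting a different value before generating the stack redirects every connection. A stdlib-only sketch of that flow (the real template assembles the DSN via pydantic settings, not os.environ directly):

```python
import os


def build_db_uri(env=None):
    # mirrors how the template assembles its DSN from environment
    # variables; the defaults here are only illustrative
    env = os.environ if env is None else env
    user = env.get("POSTGRES_USER", "postgres")
    password = env.get("POSTGRES_PASSWORD", "")
    server = env.get("POSTGRES_SERVER", "db")
    db = env.get("POSTGRES_DB", "app")
    return f"postgresql://{user}:{password}@{server}/{db}"
```

Setting POSTGRES_DB="app_test" before the stack is generated therefore points the whole backend at the test database, with no code changes.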
@lazToum that's actually perfect! I was running my tests inside an already running container so I resorted to using SQLAlchemy-Utils to check if the DB exists and create it.
```python
from sqlalchemy_utils import create_database, database_exists


@pytest.fixture(scope="session")
def db() -> Generator:
    if not database_exists(SQLALCHEMY_DATABASE_URL):
        create_database(SQLALCHEMY_DATABASE_URL)
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)
    yield TestingSessionLocal()
```
It's a work in progress, but you can check my implementation at https://github.com/sheoak/fastapi-backend/blob/1f5471e030ce52483de5e803704e955e4d9ed4b5/app/tests/conftest.py#L117
I reset the database after every integration test using a rollback.
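The rollback trick works by opening an explicit transaction before each test and rolling it back afterwards, so writes never persist. A minimal sqlite3 illustration of the idea (the linked conftest does this with SQLAlchemy sessions, not sqlite3):

```python
import sqlite3
from contextlib import contextmanager


@contextmanager
def rollback_session(conn):
    # open an explicit transaction; undo everything when the test is done
    conn.execute("BEGIN")
    try:
        yield conn
    finally:
        conn.rollback()


# isolation_level=None puts sqlite3 in autocommit mode, so the explicit
# BEGIN/ROLLBACK above fully controls the transaction boundaries
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE users (name TEXT)")

with rollback_session(conn) as s:
    s.execute("INSERT INTO users VALUES ('alice')")
    # inside the transaction the row is visible...
# ...but after the rollback the table is empty again
```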
This thread saved my night! Thank you all, keep up the good work 😎 💪
@alexvanzyl Thanks a lot for your solution!
The only thing I'm running into with this is that the overridden get_db() is not being included by coverage (obviously :P).
Is there any drawback to attaching a # pragma: no cover to that function?
Or should there be a separate test for it?
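One option besides # pragma: no cover is a tiny direct test of the override generator itself, which exercises both the yield and the cleanup. A sketch using a stand-in session class (FakeSession is hypothetical, not the real TestingSessionLocal):

```python
class FakeSession:
    """Stand-in for a session produced by TestingSessionLocal()."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


def override_get_db():
    # same shape as the override in tests/utils/overrides.py
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()


def test_override_get_db_yields_and_closes():
    gen = override_get_db()
    db = next(gen)
    assert db.closed is False
    gen.close()  # runs the finally block, as FastAPI does after a request
    assert db.closed is True
```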