
Potential Improvements

Open ThirVondukr opened this issue 1 year ago • 9 comments

First of all - it's an amazing set of best practices. I'd also like to share some things that I use:

Project Structure

src could be added to PYTHONPATH to avoid prefixing every app import with src; IDEs like PyCharm also support that.

Models could also be stored in the same package; it makes it easier to import your models and ensures all model modules are executed when you generate your migrations:

src/
  db/
    models/
      __init__.py
      comments.py
      posts.py
      tags.py
    base.py
    dependencies.py
# __init__.py
from .comments import Comment
from .posts import Post
from .tags import Tag

__all__ = ["Comment", "Post", "Tag"]
# Anywhere in the code
from db.models import Post
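
This pays off with Alembic autogenerate: importing the package once is enough for every table to be registered. A minimal sketch of the relevant part of env.py, assuming a declarative Base lives in db/base.py:

# migrations/env.py (fragment)
import db.models  # noqa: F401 - side effect: every model module runs
from db.base import Base

target_metadata = Base.metadata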

Continuous Integration

Absolutely use CI in GitLab/GitHub to automate your tests and linters!

Dependency Management

Use poetry instead of requirements.txt - it's awesome!

Custom base model from day 0

A custom base model could also be used to enable orm_mode and to set up a custom alias_generator if the client (for example, a JS app) requires it.
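
A minimal sketch of what such a base model could look like (pydantic v1 style, matching the BaseSettings example below; the to_camel helper is defined here purely for illustration):

from pydantic import BaseModel


def to_camel(field_name: str) -> str:
    # "user_id" -> "userId"
    first, *rest = field_name.split("_")
    return first + "".join(part.capitalize() for part in rest)


class AppBaseModel(BaseModel):
    class Config:
        orm_mode = True  # allow validating from ORM objects
        alias_generator = to_camel  # camelCase aliases for JS clients
        allow_population_by_field_name = True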

Use ~~Starlette's Config object~~ Pydantic BaseSettings!

Pydantic has its own class to manage environment variables:

from pydantic import BaseSettings


class AppSettings(BaseSettings):
    class Config:
        env_prefix = "app_"

    domain: str
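
With env_prefix, pydantic reads each field from the prefixed environment variable (matching is case-insensitive by default), e.g.:

# assuming APP_DOMAIN=example.com is set in the environment
settings = AppSettings()
print(settings.domain)  # "example.com"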

ThirVondukr avatar Aug 20 '22 10:08 ThirVondukr

Hey, @ThirVondukr

Thank you for your kind words, I truly appreciate them!

  1. Although I find this suggestion very handy, I am not sure about advocating implicit solutions like PYTHONPATH modifications or __init__ imports. We did put some models there where it was appropriate, though - just not as an "every package" practice, I believe.
  2. I agree that CI is very important, but it feels like that recommendation doesn't fall under FastAPI best practices in particular.
  3. poetry is great, but I wish it would be easier to use it in docker containers :)
  4. Agree, I will add these suggestions.
  5. Agree - too many people were pointing it out. My key point, though, was not to use other libraries but the already installed ones.

zhanymkanov avatar Aug 20 '22 11:08 zhanymkanov

Storing DB models in the same place is just a personal preference, but I feel it's very useful in small applications/microservices.

You can use poetry in docker, either by exporting your dependencies as requirements.txt in a multi-stage build or directly:

# Stage 1: install poetry and export the locked dependencies
FROM python:3.10-slim as build
ENV POETRY_VERSION=1.1.14
RUN pip install poetry==$POETRY_VERSION

COPY ./pyproject.toml ./poetry.lock ./
RUN poetry export -f requirements.txt -o requirements.txt


# Stage 2: slim runtime image without poetry
FROM python:3.10-slim as final

WORKDIR /app
ENV PYTHONPATH=$PYTHONPATH:src
COPY --from=build requirements.txt .
RUN pip install --upgrade pip && pip install -r requirements.txt --no-cache-dir

COPY ./src ./src
COPY alembic.ini ./

...

ThirVondukr avatar Aug 20 '22 12:08 ThirVondukr

Yep, agree. We actually do store all DB models in one folder in a small service we have for transcoding.

Well, your multi-stage example looks pretty neat and clean! I didn't like the examples I had seen online, but I will definitely try yours.

zhanymkanov avatar Aug 21 '22 03:08 zhanymkanov

I'd say there's also a "problem" with using SQLAlchemy commits with FastAPI. Usually you should use the .flush() method to send pending SQL to the database when you need server-side generated data such as primary keys; you can still use nested transactions, but in most cases you don't need to. If you use a typical get_session dependency and commit inside of it, the commit actually happens after the response is sent, which can lead to the client receiving a 200/201, for example, while the data isn't actually saved:

from typing import AsyncIterable
from sqlalchemy.ext.asyncio import AsyncSession

async def get_session() -> AsyncIterable[AsyncSession]:
    # the commit on context exit happens after the response is sent
    async with async_sessionmaker.begin() as session:
        yield session

This could be avoided by one of the following:

  1. Committing manually in each request
  2. Using another framework/library to do DI for you
  3. (Best option) Using a middleware to commit the session (see the sketch below)
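
A minimal sketch of the middleware option, assuming the dependency stores the session on request.state (all names here are illustrative):

from fastapi import FastAPI, Request

app = FastAPI()


@app.middleware("http")
async def commit_session(request: Request, call_next):
    response = await call_next(request)
    session = getattr(request.state, "session", None)
    # the endpoint has finished, but the response hasn't reached the
    # client yet - committing here avoids a 2xx for unsaved data
    if session is not None and response.status_code < 400:
        await session.commit()
    return response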

I think that could also be covered in this repo.

ThirVondukr avatar Aug 21 '22 08:08 ThirVondukr

I agree with the usage of poetry in combination with Docker multi-stage builds; this approach is also suggested by the official documentation.

Another thing I usually do is create an install-poetry.sh script with the poetry version specified inside; this way there is only one point of trust, used by both Docker and the developer.
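
A sketch of such a script, pinning the same version as the Dockerfile above:

#!/usr/bin/env bash
# install-poetry.sh - the single place where the poetry version is pinned
set -euo pipefail

POETRY_VERSION="1.1.14"
pip install "poetry==${POETRY_VERSION}"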

Then simply include it in the first Docker stage:

COPY install-poetry.sh ./
RUN bash install-poetry.sh
RUN poetry export -f requirements.txt --output requirements.txt --without-hashes

pistocop avatar Aug 26 '22 08:08 pistocop

I am using the method described here as a way to split secrets over different environments with pydantic settings: https://rednafi.github.io/digressions/python/2020/06/03/python-configs.html

b-bokma avatar Nov 25 '22 08:11 b-bokma

@b-bokma I'm currently using Kubernetes, and it's really easy to store your k8s secrets and configmaps in the same format as your pydantic models:

from pydantic import BaseSettings


class DatabaseSettings(BaseSettings):
    class Config:
        env_prefix = "database_"

    driver: str = "postgresql+asyncpg"
    sync_driver: str = "postgresql+psycopg2"
    name: str
    username: str
    password: str
    host: str

apiVersion: apps/v1
kind: Deployment
metadata:
  ...
spec:
  ...
  template:
    ...
    spec:
      ...
      containers:
        - name: fastapi
          ...
          envFrom:
            - secretRef:
                name: app-database
              prefix: database_
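
With the deployment above, the container receives database_name, database_username, etc. from the app-database secret, and pydantic picks them up with no extra glue (the URL assembly below is just an illustration):

settings = DatabaseSettings()
# e.g. "postgresql+asyncpg://user:secret@db-host/app"
url = f"{settings.driver}://{settings.username}:{settings.password}@{settings.host}/{settings.name}"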

This also helps if you need to use the same secret for different applications, so we have different sets of secrets in different namespaces at the moment 😉

ThirVondukr avatar Nov 25 '22 16:11 ThirVondukr

> poetry is great, but I wish it would be easier to use it in docker containers :)

I don't understand why not use poetry right in Docker? It is possible, and it's very convenient to use the dependency groups provided by poetry.

The workflow is the following:

  1. install the poetry tool with pip (the only concern is the size of this tool)
  2. copy the pyproject.toml and poetry.lock files
  3. set the env variable POETRY_VIRTUALENVS_CREATE to false, so no virtualenv is created in the docker container
  4. run poetry install --only main --no-root (see the sketch below)
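
A sketch of that workflow as a single-stage Dockerfile (versions and paths are illustrative):

FROM python:3.10-slim

WORKDIR /app

# 1. install the poetry tool with pip
RUN pip install poetry

# 2. copy the dependency manifests
COPY pyproject.toml poetry.lock ./

# 3. don't create a virtualenv inside the container
ENV POETRY_VIRTUALENVS_CREATE=false

# 4. install only the main dependency group, without the project itself
RUN poetry install --only main --no-root

COPY ./src ./src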

PashaDem avatar Nov 16 '23 17:11 PashaDem

@PashaDem Because I personally don't think you need an extra dependency/tool inside your docker container; if you need it for something, add it to your docker image.

ThirVondukr avatar Nov 26 '23 19:11 ThirVondukr