prisma-client-py
Using models as FastAPI response_model freezes FastAPI docs
Bug description
When I try to use a model from prisma.models as a response_model on a FastAPI endpoint, the FastAPI docs freeze when I open that endpoint to see the response type. I have tried using partials to work around this, but I can only get it to work when the partial has no relational fields. I find it hard to believe it's freezing just because the type is large. Does anyone have experience with this?
- Is there a way to control the depth of the relations generated for partials, like we can when querying? (The only workaround I've found is sketched below.)
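For reference, the only variant that has not frozen the docs for me is a partial with every relational field stripped out. A minimal sketch, assuming the exclude_relational_fields option of create_partial behaves as described in the partial types docs (file path and type name here are placeholders):

```python
# prisma/partial_types.py (assumed default partial type generator location)
# A partial containing only Bot's scalar fields, so the generated schema has
# no nested relations for FastAPI's docs to expand.
# Assumes exclude_relational_fields behaves as documented for prisma-client-py.
from prisma.models import Bot

Bot.create_partial("ChatOverviewFlat", exclude_relational_fields=True)
```

That avoids the freeze, but it drops the pillar relation entirely, which is why I'd like a way to limit relation depth instead.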
How to reproduce
Here is a simple partial I'm generating to test this, and it's failing. The pillar model also has some relations of its own.
```python
from prisma.models import Bot

ChatOverview = Bot.create_partial("ChatOverview", include={"pillar"})
```
and in the FastAPI endpoint:
```python
response_model=ChatOverview,
```
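For completeness, the endpoint looks roughly like this; the route path, function name, and Bot ID type are placeholders, and it assumes the ChatOverview partial has been generated and is importable from prisma.partials. Only the response_model usage mirrors my real endpoint:

```python
# Hypothetical reproduction sketch; route path and ID type are assumptions,
# only the response_model usage matches my real endpoint.
from fastapi import FastAPI
from prisma import Prisma
from prisma.partials import ChatOverview  # generated by the create_partial call above

app = FastAPI()
db = Prisma(auto_register=True)


@app.on_event("startup")
async def startup() -> None:
    await db.connect()


@app.get("/bots/{bot_id}", response_model=ChatOverview)
async def get_bot(bot_id: str):
    # Fetch the bot together with its pillar relation so the response data
    # matches the partial's shape (404 handling omitted for brevity).
    return await db.bot.find_unique(where={"id": bot_id}, include={"pillar": True})
```

Opening this endpoint in the interactive /docs page is when the freeze happens.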
I haven't encountered this myself; what FastAPI version are you using?
If you haven't already, I'd recommend searching the FastAPI repository to see if anyone else is running into this, and if you can reproduce it without using Prisma, open an issue with FastAPI.
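If it helps narrow things down, a quick Prisma-free check is to declare a plain Pydantic model with a similar nested relation and see whether the docs page still freezes. A rough sketch, with made-up model names and fields:

```python
# Prisma-free reproduction sketch using plain Pydantic models; the names and
# fields are made up, the point is only to mimic a model with a nested relation.
from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel


class Pillar(BaseModel):
    id: str
    name: str


class ChatOverview(BaseModel):
    id: str
    title: str
    pillar: Optional[Pillar] = None


app = FastAPI()


@app.get("/chats/{chat_id}", response_model=ChatOverview)
async def get_chat(chat_id: str) -> ChatOverview:
    # Static data is enough; the interesting part is whether opening this
    # endpoint in the interactive /docs page still hangs.
    return ChatOverview(id=chat_id, title="example", pillar=Pillar(id="1", name="example"))
```

If that renders fine in /docs, the issue is more likely specific to the types Prisma generates, and a minimal reproduction here would help.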