Connecting to a PostgreSQL database
Nothing is broken at all, but I went ahead and set up a connection to a persistent PostgreSQL database, and wanted to lay out here what I did in case it's of use to anyone.
It's possible that there are better ways to do the steps below, but it works :)
Plus, feel free to use any of this if there are any additional articles in this great series!
Updates to files
Dockerfile
- Install the PostgreSQL client and dev libraries by adding the following before `WORKDIR /app`:

```dockerfile
# Install postgres client so we can access the db directly from the command line (see below)
RUN apk add --update --no-cache postgresql-client
```

and (note: I had to add gcc for compile reasons):

```dockerfile
RUN apk update \
    && apk add postgresql-dev gcc python3-dev musl-dev libffi-dev openssl-dev
```
docker-compose.yml
- Add this to the top of the `services` section, above `nginx`:

```yaml
db:
  image: postgres
  restart: always
  volumes:
    - pgdata:/var/lib/postgresql/data
  environment:
    - POSTGRES_DB=postgres
    - POSTGRES_USER=postgres
    - POSTGRES_PASSWORD=postgres
```
- Update the `volumes` section at the bottom:

```yaml
volumes:
  static_volume: {}
  pgdata: {}
```
- Add this to the end of the `services/backend` section:

```yaml
depends_on:
  - db
```
settings.py
Updates to Django's `backend/server/server/settings.py` file. Add this after the default sqlite3 db is set (change the db name/user/password as you wish; just make sure to change them to match in the `docker-compose.yml` file as noted above):
```python
if not DEBUG:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'postgres',
            'USER': 'postgres',
            'PASSWORD': 'postgres',
            'HOST': 'db',
            'PORT': 5432,
        }
    }
```
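As a small variation (my own tweak, not part of the original write-up), you could read the credentials from environment variables instead of hardcoding them in two places. This only works if you also pass the same `POSTGRES_*` variables to the `backend` service's `environment` in `docker-compose.yml`; by default only the `db` container gets them:

```python
import os

# Minimal sketch: reuse the POSTGRES_* values defined in docker-compose.yml.
# Assumes the backend service's environment also defines these variables.
if not DEBUG:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': os.environ.get('POSTGRES_DB', 'postgres'),
            'USER': os.environ.get('POSTGRES_USER', 'postgres'),
            'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'postgres'),
            'HOST': 'db',   # the docker-compose service name
            'PORT': 5432,
        }
    }
```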
requirements.txt
Addition to `backend/requirements.txt`:

```
psycopg2-binary>=2.8
```
Note: you'll likely need to comment this line out when running locally. Ideally we could add a server check so this doesn't get installed locally when using the default sqlite3 db.
That's it.
Doing those steps, the volume was auto-created the first time I launched, the database connected successfully, and the db persists across sessions, even if I rebuild the Docker images.
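If you want to double-check the connection from the Django side (my own addition, not part of the original steps), here's a quick sketch you can paste into `python manage.py shell` inside the backend container:

```python
# Quick sanity check that Django is really talking to Postgres, not sqlite3.
from django.db import connection

print(connection.vendor)  # should print "postgresql" when the db container is in use

with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])  # prints the PostgreSQL server version string
```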
Working w/ the postgres db directly
This is only useful if you can log in to your Docker server via ssh and work through the command line. Once logged into the server via ssh, you do:
- `docker ps` to get the container id
- `docker exec -it [container_id_here] sh` to get into the postgres container as an `sh` shell
- `su postgres` to change to the user who owns the db situation
- `psql` to take you into psql so you can work with the db

To get out, just type `exit` a few times.
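If you'd rather poke at the db from Python instead of psql (just an illustration, not from the original write-up), a minimal psycopg2 sketch works from any container on the same Docker network (e.g. the backend container), using the same credentials as in `docker-compose.yml`:

```python
# Connect straight to the db container and list its databases.
import psycopg2

conn = psycopg2.connect(
    dbname='postgres', user='postgres', password='postgres', host='db', port=5432
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT datname FROM pg_database;")
    print(cur.fetchall())
conn.close()
```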
Conclusion.
I really appreciated this tutorial, one of the most useful I've found, and I wanted to help out folks interested in a persistent db connection.
Thank you @marqpdx! Great tutorial! :)
The database in Docker is OK for development or for hobby projects. For production projects, I would recommend using RDS (or a similar service).
If you have the DB connection setup ready, I would recommend python-decouple as your next step (to hide passwords from the code).
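In case it helps, a minimal sketch of what that could look like in `settings.py` (just an illustration, not from the boilerplate), assuming `python-decouple` is added to `requirements.txt` and the values live in a `.env` file kept out of version control:

```python
# settings.py -- read the db credentials via python-decouple instead of hardcoding them
from decouple import config

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': config('POSTGRES_DB', default='postgres'),
        'USER': config('POSTGRES_USER', default='postgres'),
        'PASSWORD': config('POSTGRES_PASSWORD'),  # no default: fail loudly if it's missing
        'HOST': config('POSTGRES_HOST', default='db'),
        'PORT': config('POSTGRES_PORT', default=5432, cast=int),
    }
}
```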
Thank you @marqpdx once again!
Thanks @pplonski - I'll look into both of those and add any writeups that might be useful.
The main thing, it seems, is running PostgreSQL outside of Docker, either using RDS or a similar service, or even a dedicated VPS with the db set up and well tuned, yes?
All the best!