
Docker to run the project using local environment vars

Open arruda opened this issue 9 years ago • 4 comments

Hi there, GitHub spoiled the surprise here... but yay \o/ a Docker setup to run the project =)

I'm also using Fig to link the containers and to set the environment vars more easily.

Here's how it works:

All Dockerfiles are located in the dockers folder.

DB

There is a DB container (PostgreSQL) that uses these env vars to build an empty database, ready to use:

- USER=ccbv #DB user
- DB=ccbv #DB name
- PASS=ccbv #DB user pass

So when this container starts it will prepare a PostgreSQL server with that info. All the PG data can also be accessed from the host machine at ./dockers/docker-postgresql/data.
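A minimal sketch of what that init step could do with those vars, assuming a first-boot script that feeds generated SQL to psql (the function name and exact SQL here are my assumptions, not the actual container's script):

```shell
#!/bin/sh
# Hypothetical sketch: turn the USER/DB/PASS env vars into the SQL a
# first-boot script would feed to psql. Names and SQL are assumptions.
init_db_sql() {
  printf "CREATE USER %s WITH PASSWORD '%s';\n" "$USER" "$PASS"
  printf "CREATE DATABASE %s OWNER %s;\n" "$DB" "$USER"
}

# On first start the container might then run: init_db_sql | psql -U postgres
USER=ccbv DB=ccbv PASS=ccbv init_db_sql
# prints:
#   CREATE USER ccbv WITH PASSWORD 'ccbv';
#   CREATE DATABASE ccbv OWNER ccbv;
```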

Django App

The Django app container uses these env vars:

- USER=admin2
- [email protected]
- USER_PASS=pass2
- STATIC_URL=/static/
- DEBUG=True

where USER, USER_EMAIL and USER_PASS are used to create a superuser automatically.
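Since createsuperuser couldn't take a password non-interactively in Django of that era, a common approach (sketched here with assumed names, not necessarily the project's actual script) is to pipe a one-liner into manage.py shell:

```shell
#!/bin/sh
# Hypothetical sketch: build the Python one-liner that creates the superuser
# from the USER / USER_EMAIL / USER_PASS env vars. Names are assumptions.
superuser_snippet() {
  printf "from django.contrib.auth.models import User; User.objects.create_superuser('%s', '%s', '%s')" \
    "$USER" "$USER_EMAIL" "$USER_PASS"
}

# Inside the container this would then be piped into Django:
#   superuser_snippet | python manage.py shell
```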

The Django container is linked with the DB container, and whenever it starts it will do the following:

  • set up the DATABASE_URL env var to point to the address of the linked DB container, with the user, password and database name needed to connect, all taken from the available env vars.
  • run pip install -r requirements.txt (the first time this installs everything; afterwards it just checks whether there is anything new).
  • wait until the DB container is ready to accept connections, just in case the DB container takes a long time to start.
  • if it's the first time the container is running:
    • ensure the DB is clean (drops all tables using `manage.py reset_db`)
    • sync the DB
    • run the DB migrations
    • load the fixtures (project.json, and 1.3.json-1.7.json)
    • create the superuser (with the environment variables passed to the container)
  • if it's not the first time it's running, then:
    • run the DB migrations, just to make sure there are none left to apply
  • and to finish, it runs Django with manage.py runserver 0.0.0.0:8000
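The first steps above can be sketched roughly like this; the env var DB_PORT_5432_TCP_ADDR follows Docker's old container-link naming for a container aliased db, and everything here is an assumption rather than the project's actual start.sh:

```shell
#!/bin/sh
# Hypothetical sketch of the start sequence described above; variable and
# script names are assumptions, not the project's actual code.

build_database_url() {
  # Legacy Docker links expose the linked container's address as
  # DB_PORT_5432_TCP_ADDR when its alias is "db".
  printf "postgres://%s:%s@%s:5432/%s" "$USER" "$PASS" "$DB_PORT_5432_TCP_ADDR" "$DB"
}

# In the real script this would be followed by something like:
#   export DATABASE_URL=$(build_database_url)
#   until psql "$DATABASE_URL" -c 'SELECT 1' >/dev/null 2>&1; do sleep 1; done
#   python manage.py migrate   # plus reset_db/syncdb/fixtures on first run

USER=ccbv PASS=ccbv DB=ccbv DB_PORT_5432_TCP_ADDR=172.17.0.2 build_database_url
# prints: postgres://ccbv:ccbv@172.17.0.2:5432/ccbv
```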

Checking the logs:

After you start the containers with fig up -d they'll run in the background, so if you want to check which step of the process they're in, just run:

fig logs

Stopping the containers:

To stop a container (this will keep the volumes) just run:

fig stop

Starting stopped containers:

If the containers were stopped and you want to bring them back in the same state they were in, just run:

fig start

Django reloading on change

Since the whole project folder is added to the container at run time (not at build time), whenever you change something locally in the project it will affect the container's app, so you can still take advantage of Django's server reloading on changes to any file.

And I think that's it.

arruda avatar Dec 04 '14 01:12 arruda

Wow this is a lot of work, thank you!

I'm not a docker expert, but have played with fig a little, and have a feeling that it could be a little simpler. For example, if the django container were built upon python:2.7-onbuild, there should be less involved in the setup there. What's more, there is already an official postgres container that could be put to use, meaning that we don't have to play with git submodules.

I also wonder about the necessity of the scripts. Could these steps not be added to the Dockerfiles?

Thanks again for all this, it's very exciting!

meshy avatar Dec 04 '14 10:12 meshy

Hummm... It's been a while since I last used the official postgres image; at the time it didn't have an option to set up a database after it started. I'll look into changing to the official one and adding a script to create a database for the app to connect to.

Now, about adding the scripts for the Django app to Docker: I'm going to add some inline comments in the commits... it's easier to visualize there.

arruda avatar Dec 04 '14 17:12 arruda

Ok, I've commented some parts in start.sh. The ones in first_start.sh and normal_start.sh I don't think can be placed in the Dockerfile, since all of them use the manage.py command, and so they need a database connection (South complains if not). That means you need a DB container linked during this time, and that can't be done in the build phase (when the Dockerfile is used) =/

arruda avatar Dec 04 '14 17:12 arruda

Hi there, I've done some updates everywhere I could:

  • Changed the postgresql docker to the official image,
  • changed the django docker to use python:2-wheezy (not using the onbuild variant, since we need libmemcached-dev installed before pip install -r requirements.txt is run, and the onbuild image would run pip install -r requirements.txt first =/ ),
  • moved pip install -r requirements.txt from the start.sh script to the Dockerfile.

arruda avatar Dec 06 '14 02:12 arruda

Apologies @arruda, I've left this here unanswered for too long.

I'm going to close this. The direction I'm taking the project in should mean that a Docker container will not be required, and may actually hamper development.

Thank you for your submission, and apologies once again for the long wait on this reply.

meshy avatar Nov 18 '22 10:11 meshy

No problem =)

arruda avatar Jan 20 '23 20:01 arruda