Deployment


To deploy to cjworkbench.org, you will need to get comfortable with the wonderful world of Docker!

Creating Docker images

We build our images on Docker Hub, triggered by commits to the main repo. The Dockerfiles which define the images are here.

For rapid iteration, the build is split into two images: cjworkbench-reqs starts from a base Python image and installs all of the modules in requirements.txt. (Dockerfile) This can take quite a long time, perhaps 30 minutes, because some of the modules (such as pandas) require compilation. For this reason this image does not build automatically; you must trigger it manually whenever you change requirements.txt or requirements-dev.txt.

The second image is the main cjworkbench image, which clones the repo, installs npm modules, and creates a default admin user. (Dockerfile) It is set up to rebuild on Docker Hub whenever anything is committed to the main repo.
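As a rough illustration of that split (these are not the real Dockerfiles; the base image, tag, and paths are assumptions), the two images look something like this:

# Hypothetical sketch of cjworkbench-reqs: a base Python image plus the compiled requirements
FROM python:3.6
COPY requirements.txt requirements-dev.txt /tmp/
RUN pip install -r /tmp/requirements.txt -r /tmp/requirements-dev.txt

# Hypothetical sketch of the main cjworkbench image, built on top of the reqs image
FROM jonathanstray/cjworkbench-reqs
RUN git clone https://github.com/CJWorkbench/cjworkbench.git /app
WORKDIR /app
RUN npm install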

Deploying to server

We use Docker Compose to tie the cjworkbench container together with a postgres container and to configure a few things. The [cjworkbench-docker](https://github.com/CJWorkbench/cjworkbench-docker) repository gets cloned to the server; it contains the docker-compose.yml file that specifies how everything is stitched together. It also defines the Docker volumes that preserve files from two places: the database and the importedmodules directory. All other files are erased when the Docker image is updated.
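As a rough sketch (not the actual file; service names, image names, and container paths here are assumptions), the compose file pairs the two containers and declares those persistent volumes:

version: '3'
services:
  cjw-db:
    image: postgres
    volumes:
      - db-data:/var/lib/postgresql/data        # database files survive image updates
  cjw-web:
    image: jonathanstray/cjworkbench
    ports:
      - "8000:8000"
    depends_on:
      - cjw-db
    volumes:
      - imported-modules:/app/importedmodules   # assumed path to importedmodules inside the container
volumes:
  db-data:
  imported-modules: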

To set up a server, first install Docker Compose. Then clone cjworkbench-docker:

git clone https://github.com/cjworkbench/cjworkbench-docker

To bring up the server,

cd cjworkbench-docker
source .env
docker-compose up -d

This pulls all of the necessary images, including Postgres. The .env file contains settings for all sorts of environment variables, including secret keys of various types. It must never be committed to a repo.
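The variable names below are purely illustrative placeholders (the real file defines whatever secrets and settings the app expects), but it looks roughly like this:

# Illustrative contents only; every name here is a hypothetical placeholder
export CJW_SECRET_KEY='change-me'        # Django secret key
export CJW_DB_PASSWORD='change-me'       # Postgres password shared with the db container
export CJW_SENDGRID_API_KEY='change-me'  # example third-party API key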

You can see the running containers with docker ps. The server is started inside the cjw-web container via the start-prod.sh script, and runs on port 8000 by default.
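A quick way to confirm it is answering (assuming curl is available on the host and port 8000 is published as above):

curl -I http://localhost:8000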

Updating the server

In the cjworkbench-docker directory, do

docker-compose pull
docker-compose up -d
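Old image layers pile up after repeated updates. docker image prune reclaims that disk space; by default it removes only dangling (untagged) images, and -f skips the confirmation prompt:

docker image prune -f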

Running commands inside the Docker container

You can run Django manage.py commands inside the container. In particular, the first time you bring the server up you will need to create an admin user manually. Once the server is up, do

docker exec -i -t cjw-web python manage.py createsuperuser
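Other manage.py commands work the same way. For instance, if the database schema is ever out of date (assuming start-prod.sh does not already apply migrations on startup), you can run them by hand:

docker exec -i -t cjw-web python manage.py migrate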

You can also run arbitrary Python code from within the context of the server process by doing something like

docker exec -i -t cjw-web python manage.py shell -c "from django.contrib.auth.models import User; User.objects.get(email='[email protected]').delete()"
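For anything longer than a one-liner, it can be easier to pipe a script into the Django shell; cleanup_users.py here is just a hypothetical local file:

docker exec -i cjw-web python manage.py shell < cleanup_users.py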

To get a shell inside a running container,

docker exec -i -t <container name> /bin/bash

where <container name> is cjw-web or cjw-db.
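From a shell inside cjw-db you can also inspect the database directly with psql. The user name depends on how the compose file configures Postgres; cjworkbench below is an assumption:

docker exec -i -t cjw-db psql -U cjworkbench -l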

Viewing logs

From outside the container, do

docker logs cjw-web

The same command works on cjw-db if you want to see the Postgres logs. You can also add -f to follow the logs, that is, watch them print out in real time.
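For example, to follow only the most recent output:

docker logs -f --tail 100 cjw-web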

Running the Docker image in development mode

First pull down the image. You'll need to repeat this whenever it needs an update.

docker pull jonathanstray/cjworkbench

Then start the container. You can do this interactively (container goes away with ^C)

docker run -it --rm -p 8000:8000 jonathanstray/cjworkbench

Remove the --rm flag if you want the container and its contents (e.g. all users and created workflows) to persist. Or you can run it as a background process:

docker run -d -p 8000:8000 jonathanstray/cjworkbench

Either way, since this is debug mode, all data is stored in sqlite.db as usual.
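If you do drop --rm, it helps to give the container a name so you can stop and restart the same one later and keep its sqlite.db; cjw-dev is just an example name:

docker run -d -p 8000:8000 --name cjw-dev jonathanstray/cjworkbench
docker stop cjw-dev       # stop without deleting the container or its data
docker start cjw-dev      # pick up where you left off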