Deb Constrictor (my pure-Python DPKG building application) adds many features in version 0.3 to make building packages easier and to keep your configurations DRY. This is part one of three posts about how I use Deb Constrictor to build and release this web site (the one you’re reading right now).
The application is released in three packages: a configuration package, a package containing the virtual environment, and a package just for the application. This split is because:
- The virtualenv package doesn’t change that often, so separating it out speeds up build/release of the other components.
- The application package is small and quick to build/upload.
- Different configuration packages can be built for staging/production.
- Configuration changes can be deployed without application changes.
Basically, having everything separate speeds up all releases. This post focuses on just the application assembly.
Background
This site is built on Django with Python 2.7. It runs on a server that also hosts Python 3 sites. A gulp script is used to assemble some of the static components, so it needs to run before assembling the package. Once the package is assembled it gets uploaded to an apt repository using scp. After installation, Django migrations need to be applied and static files collected.
This is a fairly typical setup, so it is easy to apply the same process to other Django sites (or indeed to other Python, PHP, or any type of application that follows the pre-process, assemble, upload, post-process formula).
Build Configuration File Inheritance
Deb Constrictor includes the constrictor-build script, which builds DPKGs (and can execute other scripts) by reading JSON configuration files. Version 0.3 introduced recursive inheritance of configuration files, as well as reading a base file from your home directory. To illustrate how this works, I’ll start by showing the main build-config.json file for the bbit-web-frontend project.
{ "parent": "../parent-config.json", "deb_constrictor": { "commands": { "prebuild": [ "./prebuild.sh" ] } }, "package": "${PROJECT_NAME}", "architecture": "all", "version": "1.20", "description": "BBIT Web Frontend.", "extra_control_fields": { "Section": "misc", "Priority": "optional", "Depends": [ "nginx", "uwsgi", "postgresql", "${PROJECT_NAME}-config", "${PROJECT_NAME}-virtualenv" ] }, "directories": [ { "source": "src", "destination": "/srv/python/${PROJECT_NAME}", "uname": "www-data", "gname": "www-data" } ], "maintainer_scripts": { "postinst": "scripts/after-install", "preinst": "scripts/before-install" } }
To explain this file a little: it instructs the package to be built so that the contents of the src directory on the build machine are installed into /srv/python/bbit-web-frontend on the server. The nginx, uwsgi and postgresql packages (plus the virtualenv and config packages) must be installed before this one.
The parent attribute is the path to another JSON file which this one will inherit from. Items in this file will either replace those in the parent (in most cases) or append to them (for example, variables). The interesting part of this file is the use of ${PROJECT_NAME} throughout. This refers to a variable which will be interpolated. But where does it come from? The parent config JSON file. Its contents are:
{ "deb_constrictor": { "environment_variables": [ ["PROJECT_NAME", "bbit-web-frontend"] ] } }
Since PROJECT_NAME is used throughout the three related packages (application, config and virtualenv), it is defined once in this parent config that all of them use, for consistency and less repetition. This use of variables will be covered in more detail in the post about packaging the virtual environment.
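To make the interpolation concrete: with PROJECT_NAME set to bbit-web-frontend, the relevant parts of the build config above effectively resolve to the following (other fields omitted):

{
    "package": "bbit-web-frontend",
    "extra_control_fields": {
        "Depends": [
            "nginx",
            "uwsgi",
            "postgresql",
            "bbit-web-frontend-config",
            "bbit-web-frontend-virtualenv"
        ]
    }
}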
There is also a base constrictor-build-config.json file in my home directory that is used as the base configuration for all packages built by this user; I’ll go into that in more detail soon, but these are its contents:
{ "deb_constrictor": { "commands": { "postbuild": ["~/deb-constrictor/upload-deb.sh"] }, "ignore_paths": [ "/.git/*", "/*.sqlite3", "/*.pyc", "/*.idea", "*/.DS_Store" ] }, "maintainer": "Ben Shaw", "extra_control_fields": { "Section": "misc", "Priority": "optional" } }
Essentially I want the same ignore_paths for all projects on the system, since I never want to accidentally include the files in those paths (this is new in 0.3). After building any package I want it to be uploaded to my apt repository, so the postbuild command to do the upload is defined in this base configuration too (also new in 0.3). The maintainer is always me (and no matter how much I like the packages I build they’re always optional).
Pre and Post Build Scripts
Also new in 0.3 is the commands parameter, which specifies commands to run before and after building the DPKG. Each command is provided as an array of arguments, in the form passed to subprocess.call, and any variables in the arguments will be interpolated.
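As a purely hypothetical illustration (announce-release.sh is a made-up script name, not something this project uses), a postbuild command with extra arguments and a variable could be declared like this:

{
    "deb_constrictor": {
        "commands": {
            "postbuild": ["./announce-release.sh", "${PROJECT_NAME}"]
        }
    }
}

Each element of the array becomes one argument to subprocess.call, so arguments containing spaces shouldn’t need shell quoting.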
The prebuild parameter is specified in the project-specific build-config.json file, since it’s only for this project, and it just executes the gulp build command.
#!/usr/bin/env bash
set -e
cd npm
gulp
The prebuild command is pretty simple. As mentioned, the postbuild command is defined in the global config file and uploads the built package to an apt repository server.
#!/usr/bin/env bash
cd ${DEB_CONSTRICTOR_WORKING_DIR}
scp ${DEB_CONSTRICTOR_OUTPUT_PATH} redacted.bbit.co.nz:~/reprepro-incoming/
deb-constrictor populates the DEB_CONSTRICTOR_WORKING_DIR and DEB_CONSTRICTOR_OUTPUT_PATH environment variables, which can be used to find the built DPKG. In this case the script uses scp to upload it to a directory that reprepro is watching. Since this script is defined in the base configuration, any built package is automatically uploaded, ready to install.
Putting It All Together
The only things I haven’t really mentioned are the preinst and postinst scripts. These are standard Debian maintainer scripts. Essentially they just create the databases and users (preinst) and then run migrations, collect static files, and restart uwsgi (postinst).
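For reference, here is a minimal sketch of what a postinst along these lines might look like. The virtualenv location and the uwsgi restart command are assumptions for illustration; they aren’t the exact scripts used for this site.

#!/usr/bin/env bash
set -e

# Assumed locations: the project directory comes from the build config above;
# the virtualenv path is a guess at where the virtualenv package installs to.
PROJECT_DIR=/srv/python/bbit-web-frontend
VENV=/srv/python/bbit-web-frontend-virtualenv

cd "${PROJECT_DIR}"

# Apply any outstanding Django migrations and collect static files.
"${VENV}/bin/python" manage.py migrate --noinput
"${VENV}/bin/python" manage.py collectstatic --noinput

# Restart the application server so it picks up the new code.
service uwsgi restart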
Releasing an upgrade of the site basically goes like this:
- Increment the version in the build-config.json file (bump2version can help with this).
- Run constrictor-build; the pre- and post-build commands automate everything else.
- SSH into the web server and apt-get upgrade (or automate this with ansible/salt, etc.).
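Condensed into commands, a release looks something like this (the hostname is a placeholder, and bump2version needs its own small config to know which file to update):

# On the build machine:
bump2version patch    # or edit "version" in build-config.json by hand
constrictor-build     # runs prebuild, builds the DPKG, runs postbuild (upload)

# On the web server (or via ansible/salt):
ssh web.example.com 'sudo apt-get update && sudo apt-get upgrade'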
You can see how Deb Constrictor helps create repeatable, automated builds, with a single command to package your project. The new features in 0.3 help reduce configuration duplication and further automate the process. In the next post, I’ll go into building the virtual environment, including using Docker for repeatable virtualenv assembly regardless of the build platform.