A post about Python programming with a tongue-twisting title…
Most Python projects rely on libraries (packages) from elsewhere, in particular from PyPI. It’s a good idea to explicitly “pin” the version of each dependency, even though it means you have to manually check for new versions of this third-party code; this is usually done in a file called requirements.txt. That way, you know that what you use in development is the same as what you deploy. Pip allows you to specify version ranges (for example, Django>=1.4.0,<1.5). However, without “pinning” to specific, exact versions you can’t be sure that a new release of some package won’t happen between the time you set up your development environment and the time you deploy.
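For illustration (these package names and versions are examples of mine, not taken from any particular project), a fully pinned requirements.txt looks like this:

```
Django==1.4.2
requests==0.14.1
South==0.7.6
```

Each `==` pin guarantees pip will install exactly that version, in development and in production alike.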
Tools exist to check whether your pinned packages have become out of date. pip-tools solves this problem, and reports the very newest published version of each of your packages. But what if you want to automatically inspect your requirements.txt and discover new versions only for bug fixes or security updates (what semantic versioning refers to as “patches”)? You might want to do this to avoid pulling in reasonably significant changes between, say, version 1.4 and version 1.5 of some package (I’ve written before about the challenges of upgrading third-party code and the importance of test libraries when doing so).
Here’s how I did this recently, realising that <1.4.999 is effectively the same as <1.5, but much easier to derive with a regular expression:
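A minimal sketch of that idea (the function name, the regex, and the fallback behaviour are my own assumptions, not necessarily the post’s exact code): take a pinned requirement such as `Django==1.4.2` and turn it into a range that allows only patch releases, by keeping the major.minor prefix and substituting 999 for the patch component.

```python
import re

# Matches a pin of the form "name==X.Y.Z"; only three-component
# versions are handled in this sketch.
PIN = re.compile(r"^(?P<name>[\w.-]+)==(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)$")

def patch_range(requirement):
    """Rewrite "name==X.Y.Z" as "name>=X.Y.Z,<X.Y.999".

    The upper bound <X.Y.999 is effectively the same as <X.(Y+1),
    but it can be built by simple substitution rather than by
    incrementing the minor version.
    """
    m = PIN.match(requirement.strip())
    if not m:
        # Leave comments, ranges, and anything else we don't
        # understand untouched.
        return requirement
    name, major, minor, patch = m.group("name", "major", "minor", "patch")
    return "%s>=%s.%s.%s,<%s.%s.999" % (name, major, minor, patch, major, minor)

print(patch_range("Django==1.4.2"))  # Django>=1.4.2,<1.4.999
```

Running each line of a requirements.txt through a helper like this yields a file that pip can install as usual, while permitting only patch-level upgrades.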