OP's article starts with "The following is a guest post by NCommander of SoylentNews fame!", so I don't think there are any real attribution issues here.
I'm fine with either showing up, although it was originally posted to SoylentNews (of which I'm a site administrator) and then sent to neozeed to post, since SN handles news more like HN does and I didn't want it to get lost in the archives. It's also on DEFCON 201's blog.
I wonder if this is the case: the link shows a 320-page patent translated into three languages, with tremendous work behind it. It's definitely not patent trolling.
What makes it more relevant to today's situation is that it was produced by people from the Institut Pasteur in France, which is a reference institution in the field.
Umm ... requirements.txt and pypi? Sbt/Maven/Gradle and maven/ivy repos? package.json?
Unless you set up a local caching, non-deleting JFrog Artifactory for all your dependencies (only jar files are covered in the free version, BTW, and there aren't any tools in the OSS world that handle all those repo types), you are going to depend on external URLs.
This is why my build systems will only use release tarballs that can be cached. I built HashCache [0][1] to handle caching of these tarballs by their SHA-256, then the build system can fetch from there first to ensure a local mirror is available. My download process sometimes also caches into my home directory (into ~/.local/cache/by-hash/sha256/...) to avoid hitting the network at all if possible.
Also, some tools try to download things from the internet as part of the BUILD process. I also disable that using another tool [2].
Combined, these really help me keep my software buildable when upstream goes away.
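HashCache's actual interface isn't shown above, so here's a rough sketch of the by-hash lookup scheme it describes: check a local `~/.local/cache/by-hash/sha256/...` directory first, and only hit the network on a miss, verifying the digest before storing. The function name and cache layout details here are my own assumptions, not HashCache's API.

```python
import hashlib
import os
import urllib.request

# Cache directory keyed by content hash, as described in the comment above.
CACHE_DIR = os.path.expanduser("~/.local/cache/by-hash/sha256")

def fetch_tarball(url, sha256_hex):
    """Return the tarball bytes, preferring the local by-hash cache.

    (Hypothetical helper; HashCache's real interface may differ.)
    """
    cached = os.path.join(CACHE_DIR, sha256_hex)
    if os.path.exists(cached):
        # Cache hit: no network access at all.
        with open(cached, "rb") as f:
            return f.read()
    # Cache miss: download, verify the digest, then store by hash.
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    if hashlib.sha256(data).hexdigest() != sha256_hex:
        raise ValueError("digest mismatch for %s" % url)
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(cached, "wb") as f:
        f.write(data)
    return data
```

Because files are named by their SHA-256, a hit never needs to re-verify or re-download, and the same cache can back any number of build trees.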
And it's hardly trivial to de-risk. People grow apart (even if you were perfect together at marriage), and the cost of splitting can be enormous: emotionally, financially, and socially.