From: Jacob Bachmeyer
Subject: Re: reproducible dists and builds (was: GNU Coding Standards, automake, and the recent xz-utils backdoor)
Date: Tue, 02 Apr 2024 20:30:49 -0500
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.8.1.22) Gecko/20090807 MultiZilla/1.8.3.4e SeaMonkey/1.1.17 Mnenhy/0.7.6.0
Richard Stallman wrote:
> [[[ To any NSA and FBI agents reading my email:  please consider    ]]]
> [[[ whether defending the US Constitution against all enemies,      ]]]
> [[[ foreign or domestic, requires you to follow Snowden's example.  ]]]
>
>   > What would be helpful is if `make dist' would guarantee to produce the same
>   > tarball (bit-to-bit) each time it is run, assuming the tooling is the same
>   > version.  Currently I believe that is not the case (at least due to timestamps)
>
> Isn't this a description of "reproducible compilation"?
No, but it is closely related. Compilation produces binary executables, while `make dist` produces a freestanding /source/ archive.
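For what it's worth, GNU tar and gzip already have the knobs needed to pin down the usual sources of variation in a tarball. A minimal sketch, assuming GNU tar >= 1.28 and a fixed timestamp chosen by the maintainer (the value below is hypothetical, and this is not what automake's `make dist' does today):

    # Hypothetical fixed timestamp (e.g. the release date); purely an
    # illustration of the flags involved, not automake's current behavior.
    SOURCE_DATE_EPOCH=1712102400

    tar --sort=name \
        --owner=0 --group=0 --numeric-owner \
        --mtime="@${SOURCE_DATE_EPOCH}" \
        -cf - foo-1.0/ | gzip -n > foo-1.0.tar.gz

Two runs of that pipeline over the same tree should produce byte-identical output, because directory ordering, ownership, file mtimes, and gzip's embedded name/timestamp (suppressed by -n) are all pinned.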
> We want to make that standard, but progress is inevitably slow because
> many packages need to be changed.
I am not sure that is actually a good idea. (Well, it is mostly a good idea, except for one issue.) If compilation is strictly deterministic, then everyone ends up with identical binaries, which means an exploit that cracks one will crack all. Varied binaries make life harder for crackers developing exploits, and may even make "one exploit to crack them all" impossible. This is one of the reasons that exploits have long hit Windows (where all the systems are identical) so much harder than the various GNU/Linux distributions (where the binaries are likely different even before distribution-specific patches are considered).
Ultimately, this probably means that we should have both the /ability/ to compile deterministically and either a compiler mode or a post-processing pass (a linker option?) to intentionally shuffle the final executable.
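To make the two halves concrete, here is a rough sketch. Assumptions: a GCC recent enough to honor SOURCE_DATE_EPOCH, and PIE plus the kernel's ASLR standing in for the "shuffle" step, since as far as I know no standard linker option exists today for randomizing the on-disk layout:

    # Deterministic build: pin the embedded date/time and GCC's internal
    # random seed so rebuilding the same sources yields the same bits.
    export SOURCE_DATE_EPOCH=1712102400      # hypothetical fixed value
    gcc -O2 -frandom-seed=foo -fPIE -pie -o foo foo.c

    # Run-time variation: with ASLR, each execution gets a different
    # address-space layout even though the file on disk is identical.
    ./foo
    ./foo

That gives reproducible bits on disk (so distributions and users can verify builds) while still denying an attacker a single fixed memory layout, though it is weaker than genuinely shuffling the executable itself.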
-- Jacob