How is software packaged and distributed securely on Linux?

tl;dr: Why a signature/checksum chain of custody needs reproducible builds, and why the web of trust can get big and complicated.

Hi /u/TwistedTomZ, OP here. What you say is true, but maybe a bit of an oversimplification.

It's true that a combination of GPG signatures and checksums is used to create a chain of custody, if you will. But things get a little more complicated when you try to actually download the same source code, apply the same changes, and reproduce the resulting binary yourself.
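To make the checksum half of that concrete, here's a rough sketch in Python (just for illustration; the file name and published digest are made up): you download a tarball, hash it yourself, and compare against the digest the maintainers published, after you've separately verified the GPG signature on the file that lists those digests.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a downloaded file, streaming in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical values: the published digest would come from a SHA256SUMS-style
# file whose GPG signature you have already checked against the maintainer's key.
published = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
actual = sha256_of("some-package-1.0.tar.gz")
print("OK" if actual == published else "MISMATCH - do not install")
```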

This is because, even with the same source code, different machines can produce binaries with different checksums. That could lead you to believe an attacker tampered with the binary at the end of the chain of custody (the point where the source code gets compiled).

The reason is that compilers can be set up to optimize the code so it runs best on your particular chip's instruction set and other platform-specific features (its microarchitecture), so the exact bytes that come out depend on the machine doing the build.
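A toy example of why that breaks checksum comparison: if even one embedded value differs between two builds (a timestamp, a CPU-specific instruction choice, whatever), the digests come out completely unrelated. The byte strings here are obviously made up.

```python
import hashlib

# Toy stand-ins for two builds of the same source: identical except for one
# embedded value (say, a build timestamp or a platform-specific detail).
build_a = b"\x7fELF...code...BUILT_AT=1700000000"
build_b = b"\x7fELF...code...BUILT_AT=1700000001"

print(hashlib.sha256(build_a).hexdigest())
print(hashlib.sha256(build_b).hexdigest())
# The two digests share nothing in common, so a checksum can only answer
# "bit-for-bit identical or not" - it can't tell a harmless build
# difference from a malicious patch.
```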

The idea behind reproducible builds, it seems, is to compile software in a tightly controlled environment (pinned toolchain, timestamps, paths, and so on) so that anyone gets the same checksum regardless of their machine.
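As a very rough sketch of what "tightly controlled" means (my own toy illustration, not how any distro actually does it): if you pin things like file ordering, timestamps, and ownership when packing up a build, two different machines can end up with a bit-for-bit identical archive, and therefore the same checksum. The directory name is hypothetical.

```python
import hashlib
import io
import tarfile
from pathlib import Path

def deterministic_tar_digest(src_dir: str, fixed_mtime: int = 0) -> str:
    """Pack src_dir into an in-memory tar with normalized metadata and hash it.

    Sorting the file list and pinning mtime/ownership removes two common
    sources of non-determinism, so two machines packing the same tree
    should get the same digest.
    """
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for path in sorted(Path(src_dir).rglob("*")):   # stable file order
            info = tar.gettarinfo(str(path), arcname=str(path.relative_to(src_dir)))
            info.mtime = fixed_mtime                     # pin timestamps
            info.uid = info.gid = 0                      # pin ownership
            info.uname = info.gname = ""
            if path.is_file():
                with open(path, "rb") as f:
                    tar.addfile(info, f)
            else:
                tar.addfile(info)
    return hashlib.sha256(buf.getvalue()).hexdigest()

print(deterministic_tar_digest("my-project"))  # hypothetical source directory
```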

Trust relationships tend to make the chain of custody even more complicated, since, when you have a lot of package maintainers, they can countersign (endorse) each other's keys and those of subordinates, expanding the web of trust.
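The way I picture that part (purely a toy model, names invented): key signatures form a graph, and "do I trust this maintainer's key?" roughly becomes "is there a chain of signatures from a key I already trust to theirs?". Real GPG trust models are more nuanced than this (owner trust levels, marginal vs. full trust, etc.), but reachability is the core idea.

```python
from collections import deque

# Toy signature graph: an edge "A -> B" means key A has signed (vouched for) key B.
# All names are made up for illustration.
signatures = {
    "you":               ["distro-master-key"],
    "distro-master-key": ["maintainer-alice", "maintainer-bob"],
    "maintainer-alice":  ["maintainer-carol"],
    "maintainer-bob":    [],
    "maintainer-carol":  [],
}

def trusted(start: str, target: str, graph: dict) -> bool:
    """Return True if there is a chain of signatures from start to target."""
    seen, queue = {start}, deque([start])
    while queue:
        key = queue.popleft()
        if key == target:
            return True
        for nxt in graph.get(key, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(trusted("you", "maintainer-carol", signatures))  # True, via maintainer-alice
```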

I'm not sure how far the web of trust goes, or exactly how it works, which is part of the reason I asked the question. Also, I only became familiar with the concept of reproducible builds yesterday, so take what I say with a grain of salt. I think it's fair to say it's amazing that we aren't all swimming in malware right now, though.
