• barsoap · 10 points · 3 months ago

    Downloading from GitHub is how NixOS avoided getting hit. On unstable, that is; on stable a tarball gets downloaded (EDIT: fixed links).

    Another reason it didn’t get hit is that the exploit is Debian/Red Hat-specific, checking for files and env variables that just aren’t present when Nix builds it. That doesn’t mean that Nix couldn’t be targeted, though. Also, it’s a bit iffy that replacing the package on unstable took on the order of 10 days, which was 99.99% build time because it’s a full rebuild. Much better on stable, but it’s not like unstable doesn’t get regular use by people, especially as you can mix and match when running NixOS.

    It’s probably a good idea to make a habit of pulling directly from GitHub (or, more generally, the VCS). Nix checks hashes all the time, so upstream doing a sneak change would break the build; it’s more about the version you’re using being the one that has its version history published. Also: why not?
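
    As a rough sketch of what that looks like in nixpkgs terms (the owner/repo, the version and lib.fakeHash are placeholders for illustration, not the actual nixpkgs expression):

    ```nix
    # Hypothetical sketch: pin the source to GitHub's autogenerated tag
    # archive instead of a maintainer-uploaded release tarball. Nix verifies
    # the fixed-output hash, so a silent upstream change breaks the build.
    { stdenv, lib, fetchFromGitHub }:

    stdenv.mkDerivation rec {
      pname = "xz";
      version = "5.4.6"; # illustrative version

      src = fetchFromGitHub {
        owner = "tukaani-project";
        repo = "xz";
        rev = "v${version}";
        hash = lib.fakeHash; # placeholder; the first build prints the real hash
      };

      # Building from the repository rather than a release tarball means the
      # generated autotools files aren't shipped, so a real derivation would
      # also need autoreconfHook and xz's usual build inputs.
    }
    ```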

    Overall, who knows what else is hidden in that code, though. I’ve heard that Debian wants to roll back a whole two years, and that’s probably a good idea. In general we should be much more careful about the TCB, and actually have a proper TCB in the first place, which means making it small and simple. Compilers are always going to be an issue, as small is not an option there, but the likes of HTTP clients, decompressors and the like? Why can they make coffee?

    • chameleon@kbin.social · 5 points · 3 months ago

      You’re looking at the wrong line. NixOS pulled the compromised source tarball just like nearly every other distro, and the build ends up running the backdoor injection script.

      It’s just that, much like Arch, Gentoo and a lot of other distros, it doesn’t meet the gigantic list of preconditions for it to inject the sshd-compromising backdoor. But if it had gone undetected for longer, it would have met the conditions for the “stage3”/“extension mechanism”.

      • barsoap · 2 points · 3 months ago

        > You’re looking at the wrong line.

        Never mind the lines I linked to; I just copied the links from search.nixos.org and those always link to the description field’s line for some reason. I did link to unstable twice, though; this is the correct one, and as you can see it goes to tukaani.org, not github.com. Correct me if I’m wrong, but while you can attach additional stuff (such as pre-built binaries) to GitHub releases, the source tarballs will be generated from the repository and a tag, so they will match the repository. Maybe you can do some shenanigans with rebase, which should be fixed.

        • chameleon@kbin.social · 3 points · 3 months ago

          For any given tag, GitHub will always have an autogenerated “archive/” link, but the “release/” link is a set of maintainer-uploaded blobs. In this situation, those are the compromised ones. Any distro pulling from an “archive/” link would be unaffected, but I don’t know of any doing that.

          The problem with the “archive/” links is that GitHub reserves the right to change them. They’re promising to give notice, but it’s just not a good situation. The “release/” links are only going to change if the maintainer tries something funny, so the distro’s usual mechanisms to check the hashes normally suffice.
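
          In nixpkgs terms the two look roughly like this (v5.6.1 and the fakeHash placeholders are purely illustrative):

          ```nix
          { lib, fetchurl, fetchzip }:
          {
            # Maintainer-uploaded release asset ("releases/download/..."); the
            # kind of tarball that carried the compromised build scripts.
            fromRelease = fetchurl {
              url = "https://github.com/tukaani-project/xz/releases/download/v5.6.1/xz-5.6.1.tar.gz";
              hash = lib.fakeHash; # placeholder
            };

            # Autogenerated archive for the git tag ("archive/refs/tags/...");
            # it matches the repository contents, but GitHub may regenerate it.
            fromTag = fetchzip {
              url = "https://github.com/tukaani-project/xz/archive/refs/tags/v5.6.1.tar.gz";
              hash = lib.fakeHash; # placeholder
            };
          }
          ```

          Note that fetchzip hashes the unpacked tree rather than the tarball bytes, which makes it less sensitive to GitHub re-generating the archive with different compression.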

          NixOS 23.11 is indeed not affected.

          • barsoap · 2 points · 3 months ago

            > They’re promising to give notice, but it’s just not a good situation.

            cache.nixos.org keeps all sources, so once Hydra has ingested something it’s not going away unless the NixOS maintainers want it to. The policy for decades was simply “keep all derivations”, but in the interest of space savings it has recently been decided to do a GC run, meaning that 22-year-old derivations will still be available, but you’re going to have to build them from the cached source; the pre-built artifacts will be gone.
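
            As a rough sketch of that fallback, evaluating a package from an older nixpkgs (here the nixos-23.11 branch tarball, purely as an example) still works: the pinned source gets substituted from the cache, and the package is rebuilt locally if the binary is gone.

            ```nix
            let
              # An arbitrary older nixpkgs revision used for illustration;
              # in practice you'd pin a specific commit.
              oldNixpkgs = builtins.fetchTarball
                "https://github.com/NixOS/nixpkgs/archive/nixos-23.11.tar.gz";
            in
            (import oldNixpkgs { }).xz
            ```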