• Rustmilian@lemmy.world
      8 months ago

      You realize nobody would know about this in the first place if it were proprietary, right?
      FOSS allows for whistleblowers, scrutiny, and audits. Proprietary ‘security via obscurity’ does not.

      • sir_reginald@lemmy.world
        8 months ago

        I’m perfectly aware of all that. But cryptography is an extremely complicated discipline: even the most experienced mathematicians have a hard time designing and scrutinizing an algorithm, so they rely heavily on peer review. If a major institution like NIST is biased by the NSA, it has a much better chance of compromising algorithms, if that is its intention.

        • Rustmilian@lemmy.world
          8 months ago

          You’d be surprised what the worldwide collective of cryptographers is capable of when they’re able to scrutinize a project in the first place. Which would you prefer: a closed, unscrutinizable encryption algorithm, or one that’s entirely open from the ground up?
          NIST could do damage if it’s biased, but it’s not as if people aren’t keeping a close eye on it and scrutinizing every mistake they can find. Especially for an algorithm as globally important as PQC.

          • sir_reginald@lemmy.world
            8 months ago

            I’m totally against anything proprietary. That’s the first requirement for anything I use. And I’m not advocating for proprietary algorithms at all; that would very much be the demise of encryption.

            I’m just worried that a sufficiently influential actor (say, a government) could theoretically bribe these institutions into promoting weaker encryption standards. I’m not even saying they’re trying to introduce backdoors, just that, as the article suggests, they might bias organizations toward supporting weaker algorithms.

            AES-128 is still considered secure for use in public institutions, even though modern computers can do much stronger encryption without being noticeably slower.

            • Rustmilian@lemmy.world
              8 months ago

              A huge number of organizations are already biased and using weaker algorithms… They just do so under the obscurity of proprietary software, so it’s much harder to scrutinize them.