• ilickfrogs@lemmy.world

    E2E encrypted communication can be used for nefarious things, that’s a fact. But it’s something that needs to be standardized, because giving so few individuals access to any and all private communication or information can be used far more nefariously. Really wish people were more concerned about data privacy. It’s not about how your data will be used against you… It’s about how OUR data is used against US.

  • Dark_Blade@lemmy.world

    Good. Despite all the mistakes they make, at least Apple seems to be willing to learn from some of ‘em and stand up for their users (even if only a little).

    • shinjiikarus@lemmy.world

      I actually don’t think this has anything to do with standing up for their users; it’s a simple cost/benefit analysis: building compromised E2E communication that is still reasonably secure against bad actors is much more difficult (if not impossible) than building robust E2E communication. Apple just doesn’t want to lose business users over headlines like “iOS messaging used by Chinese spies to steal US trade secrets,” while headlines about how difficult it is for government agencies to unlock iPhones probably drive sales. Nothing moral or ethical here, only profit.

        • 3rdBlueWizard@lemmy.world

          Yep. Honestly, if there’s a good profit motive to do the right thing, I trust companies far more to actually do the right thing. We WANT there to be profit in doing the right thing. When there isn’t, they don’t.

        • heirloomvegtattoo@lemmy.world

          They just launched a whole ad campaign based around iMessage’s encryption as well… not supporting it would be a bad look and a waste of ad dollars.

  • generalpotato@lemmy.world

    Didn’t Apple try to introduce this and got a ton of flak from all sorts of privacy “experts”? They then scrapped their plans, did they not? How is this any better/different? Any sort of “backdoor” into encryption means that the encryption is compromised. They tackled this in 2014 in the US. Feels like deja vu all over again.

    • AlexKingstonsGigolo@kbin.social

      @generalpotato Ish. I read the technical write-up, and they actually came up with a very clever privacy-focused way of scanning for child porn.

      First, only photos were scanned and only if they were stored in iCloud.

      Then, only cryptographic hashes of the photos were collected.

      Those hashes were then checked against cryptographic hashes of known child porn images, images which had to be in the databases of multiple non-governmental organizations. So if an image was only in the database of, say, the National Center For Missing And Exploited Children, or only in the database of China’s equivalent, its cryptographic hash couldn’t be used. This requirement would make it harder for a dictator to slip in a hash to look for dissidents, since getting an image into enough databases is substantially more difficult.

      Even then, an Apple employee would have to verify actual child porn was being stored in iCloud, and only after 20 separate images were flagged. (The odds of any innocent person even making it to this stage were estimated to be something like one false positive a year, I think, because of all of the safeguards Apple had.)

      Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-government organizations alerted.
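      The multi-database requirement and the review threshold described above can be sketched roughly like this. This is a hypothetical illustration only: the names and the plain set lookup are my assumptions, and the real design used perceptual hashing and private set intersection rather than exact hash matching.

```python
# Toy sketch of "must be in every NGO database" plus a review threshold.
# Names and the THRESHOLD value are illustrative assumptions, not Apple's.

THRESHOLD = 20  # matches required before any human review happens


def eligible_hashes(*databases: set) -> set:
    """Only hashes present in EVERY independent database are matchable,
    so no single organization (or government) can add entries unilaterally."""
    return set.intersection(*databases)


def flag_for_review(photo_hashes: list, databases: list) -> bool:
    """Return True only once the match count crosses the threshold;
    below it, nothing is ever surfaced to a human reviewer."""
    matchable = eligible_hashes(*databases)
    matches = sum(1 for h in photo_hashes if h in matchable)
    return matches >= THRESHOLD
```

      Note how a hash present in only one database never counts toward the threshold, which is the safeguard against a single government slipping in a hash of its own.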

      Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person being incorrectly reported to any government authority.

      • MisuseCase@infosec.pub

        It would have worked, and it would have protected privacy, but most people don’t understand the difference between having a hash of known CSAM on your phone for comparison purposes and having actual CSAM on your phone, and it freaked people out.

        I understand the difference and I’m still uncomfortable with it, not because of the proximity to CSAM but because I don’t like the precedent of anyone scanning my encrypted messages. Give them an inch, etc.

      • ansik@kbin.social

        Great writeup! I tried searching but came up short, do you have a link to the technical documentation?

      • generalpotato@lemmy.world

        Haha! Thanks for the excellent write-up. Yes, I recall Apple handling CSAM this way; they went out of their way to try and convince users it was still a good idea, but still faced a lot of criticism for it.

        I doubt this bill will be as thorough, which is why I was asking. Apple could technically comply using some of the work it did, but it’s sort of moot if things are end-to-end encrypted.

    • thann@kbin.social

      Apple wants money for spying on their users; this bill would compel them to do that without the secret money they’re getting now, so they’re against it.

      • generalpotato@lemmy.world

        Nah, Apple is one of the few companies around that is big on privacy and uses privacy as a differentiator for its products. Look at some of the other responses; it’s more complex than them just wanting money. They already make a boatload of it.

    • Overzeetop@lemmy.world

      Isn’t e2ee messaging intended to encrypt data transfer between devices, not provide global security? The default iCloud backups are still encrypted, but the key is stored by, and recoverable from, Apple. This is the ideal sort of encryption for 99%* of the population. For the 1%, there is the option to forgo the Apple-stored key.

      * yes, I made that number up. I will stand by it.

      If you ran a random survey asking users whether they want a secure, e2ee-encrypted backup, they will say yes. Overwhelmingly. But if you ask the question based on the outcome of a forgotten password: “If you lose or damage your phone and have forgotten your Apple account password, would you like all of your iCloud photos, messages, emails, contacts, and documents to be instantly destroyed and unrecoverable, or should Apple be able to restore everything if you can prove it’s your account?” they will almost certainly choose the latter.

      • Marcy_Stella@lemmy.world

        If a user wants more protection there is “Advanced Data Protection,” which fully encrypts all iCloud data. However, Apple knows you might lose your password, so they require a recovery method before turning it on and make sure you know Apple won’t be able to help you if you lose both your password and your recovery method.

        Also, certain sensitive data, such as health data and passwords, is fully end-to-end encrypted even in standard mode, since it’s determined to be worse for someone else to get access to that data than for you to lose it, whereas losing your photos is generally worse than someone else getting access to them.

      • Overzeetop@lemmy.world

        No, they’re encrypted. But Apple stores a copy of your key because most people forget their Apple password at some point (usually after they’ve wiped their phone and are setting up a new one) and need Apple to reset their password/re-enable their encryption key on the new device.

          • Marcy_Stella@lemmy.world

            Also know that if you still want iCloud backups but want everything stored encrypted, you can enable “Advanced Data Protection,” which means that Apple doesn’t store the encryption key. You do need to set up a recovery method, such as a recovery key or a recovery contact; if you lose both your device and your recovery method, your data is lost forever, and Apple can’t help you like it can in standard data protection mode.

            Also note that certain sensitive categories, such as health and passwords, are always encrypted, since it’s determined to be worse for someone else to get access to that data than for the user to lose it; meanwhile, a user losing their photos and messages because they forgot their password is generally worse than a hacker resetting the password and getting access.

  • RandomBit@lemmy.sdf.org

    To me, this seems like such a transparent attempt to force the tech companies to have a backdoor. If they can scan for CSAM, they can scan (or copy) anything else the government wants.

    • 2xsaiko@discuss.tchncs.de

      That’s very likely the actual goal. Stopping child abuse is only an excuse, one governments keep pulling out whenever they want to push anti privacy legislation. And it’s clear that this would do nothing to stop it either, because then abusers just wouldn’t use compromised services from big companies.

  • sebinspace@lemmy.world

    Well yeah, even if they aren’t good at it and are sort of hypocritical about it, appearing to believe the “what happens on iPhone stays on iPhone” philosophy is important to them.

    • Dick@lemmy.world

      I wouldn’t say they’re hypocritical. I was in complete shock that they actually scrapped their iPhone-scanning plans and now offer E2E for most of iCloud. They aren’t perfect, but they’re definitely better than most companies.

      • sebinspace@lemmy.world

        Yeaaaaah maybe read up on some of the E2E stuff. Someone else at the top of this post posted a link to how it works.

        • Marcy_Stella@lemmy.world

          It’s generally a question of what’s best for the user: your general user would likely be more mad about losing all their messages because they forgot their password than they are calmed by the fact that no one else can read the data. Same for photos and files. However, sensitive categories such as health and passwords are always end-to-end encrypted, since it’s determined to be worse for anyone else to get that data than for the user to lose it.

          For anyone who truly cares about complete encryption there is Advanced Data Protection, but for general users the defaults are a good balance between security and ease of use.

          • sebinspace@lemmy.world

            Yeah, all of that is true. But people who buy iPhones and assume, because the marketing said so, that they’re perfectly secure are worryingly ignorant.

        • asbestos@lemmy.world

          Yeah but his point stands. Here’s the summary:

          • If you’re syncing iMessages via iCloud but don’t use iCloud Backup, iMessage is E2E encrypted.
          • If you have iCloud Backup enabled without Advanced Data Protection, iMessage isn’t effectively E2E encrypted, because the backup includes a message key that Apple holds.
          • If you have Advanced Data Protection turned on for iCloud, iMessage is E2E encrypted however you look at it, iCloud Backup or not.

          It’s generally good practice not to use iCloud backups but to back up yourself; however, most people don’t care enough.
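          The three cases above boil down to a tiny decision rule. A sketch, with the function and flag names made up for clarity (they are not Apple API names):

```python
# Illustrative only: whether iMessage content is E2E-protected end to end,
# given the two iCloud settings discussed above.

def imessage_effectively_e2e(icloud_backup: bool, adp: bool) -> bool:
    """iMessage is always E2E encrypted in transit, but a standard iCloud
    Backup (without Advanced Data Protection) stores a copy of the message
    key that Apple can read, undoing the guarantee for stored messages."""
    if adp:
        return True               # ADP: backup keys are user-held either way
    return not icloud_backup      # no ADP: only safe if backups are off
```

          So the only configuration that loses the guarantee is standard iCloud Backup without ADP.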

    • ScoobyDoo27@lemmy.world

      Man, I was hoping that by moving away from Reddit I could move away from the pure hatred of Apple for whatever reason. Show me how the other mobile OS is making things any better.

      • sebinspace@lemmy.world

        I use an iPhone 12. I’m not going to defend Android because I don’t use it. I’m just not under the illusion of whatever Apple marketing distills complex problems down to, for better or worse, and being disillusioned isn’t “hate”, it’s awareness. Hate is something I reserve for my mother and father. This is just a goddamn phone.

        Moreover, being less bad than the other guy doesn’t make you not bad. Your whataboutism is weak tea.

        • ScoobyDoo27@lemmy.world

          Except “less bad” is better than bad when you only have two choices. I’d love to know where Apple has blown it on their privacy record. And don’t try to bring up the CSAM shit, because they walked that back when they realized their userbase didn’t want it.

  • Untitled9999@kbin.social

    I think law enforcement should be able to intercept messages on services like WhatsApp, if someone is suspected of criminal activity.

    Is it right for criminals to be able to share child abuse material, or plans for terrorism, over something like WhatsApp? Without law enforcement being able to intercept these messages?

    I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?

    • conciselyverbose@kbin.social

      It’s simple.

      If it’s possible for WhatsApp to intercept the communications of “bad people” for law enforcement, it’s fundamentally impossible for any communication to be private. The existence of a back door is automatically a gaping security flaw.

      There’s no such thing as “securely intercepting” messages. Either they’re secure against all actors or they’re not secure.
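      The point can be shown with a toy cipher. This is a deliberately simplified sketch, not a vetted construction: SHA-256 in counter mode as a keystream, with the "lawful intercept" design modeled as nothing more than a second copy of the key. Whoever holds the escrow copy, or steals the escrow database, reads every message, not just the "bad guys'".

```python
import hashlib
import secrets


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy stream cipher: SHA-256 in counter mode. Illustration only;
    do not use for real secrecy."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct


def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))


# Alice and Bob share a session key; a back-doored design also hands a
# copy to an escrow server.
session_key = secrets.token_bytes(32)
escrow_copy = session_key  # the back door IS the key

blob = encrypt(session_key, b"meet at noon")
# Anyone holding the escrow copy decrypts everything:
assert decrypt(escrow_copy, blob) == b"meet at noon"
```

      There is no way to build the escrow copy so that it only works against criminals; the math cannot tell a warrant from a thief.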

      • Untitled9999@kbin.social

        Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.

        Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.

        • Marcy_Stella@lemmy.world

          OK, so here’s a basic question you should be able to answer: how do you stop a foreign government from spying on other countries’ citizens? WhatsApp is not just a western-world app; it’s used in Russia as well as the US and the UK. So if Putin went to Meta and said “I want everything you have on ex-Prime Minister of the United Kingdom Boris Johnson, and you can’t tell him,” what reason would Meta have to deny the request, if the precedent set by the UK is that this data needs a back door? If you say the user should be notified, then anyone under investigation will simply never say anything incriminating; and if it includes old messages, you risk political espionage over anything shared under the assumption that it was end-to-end encrypted. What about trade secrets? A corrupt government official could get a company’s trade secrets for a business friend from anywhere in the world.

          There is a great video by Tom Scott about this exact situation, from when the UK tried to break encryption five years ago (it failed because it wasn’t feasible from a security standpoint). There is also a great episode of Last Week Tonight on encryption and government attempts to get around it. We’ve seen from things like the Pegasus malware that repressive governments will use any little break in encryption to jail protestors and journalists and spy on their political rivals; having an official way in will just make it easier.

        • cacheson@kbin.social

          Maybe it’s worth having that security hole then. I think it’s a bit crazy that terrorists or child abusers can plan their crimes using WhatsApp without the police being able to intercept their messages.

          Encryption exists. Terrorists and child abusers will use it whether WhatsApp or Apple or whoever implement it or not. Stopping those implementations is just denying privacy to regular users.

          Also, if we’re able to contact our banks over the internet securely (and obviously the bank can still see everything about our accounts if they want, while criminals hopefully won’t be able to), then surely an equivalent should be possible for things like WhatsApp.

          Law enforcement can’t eavesdrop on your encrypted connection to your bank. If they need to know about your banking activity, they rely on the bank reporting it to them.

    • iceonfire1@lemmy.world

      I think law enforcement can break into your home if they have a court warrant, right? So why not allow the same thing with electronic communications?

      For me, the reason to disallow it is the potential for abuse. There were 864 search warrant applications across all federal agencies in 2022. In 2020, the FBI, specifically, issued 11504 warrants to Google, specifically, for geofencing data, specifically. Across all agencies there are probably millions of such “warrants” for data.

      It’s far easier to access your data than your house, so comparing physical and cybersecurity doesn’t really make sense.

      In general, criminals can easily just move to an uncompromised platform to do illegal stuff. But giving the govt easy access to messaging data allows for all kinds of dystopic suppression for regular people.

      • Marcy_Stella@lemmy.world

        Generally, tech companies now have agreements with law enforcement so they don’t have to deal with all the legal mumbo jumbo. Some data still requires a warrant, such as data covered by protection laws (HIPAA-protected data, for example) or data the company considers highly sensitive, but for a lot of data it’s easier to just hand it over than to get legal involved.