• Zombo@partizle.comOP · 1 year ago

    Yes, there is potential for a slippery slope, and any filtering technology could be used for nefarious purposes. But this strikes me as pretty far from the slope, and the purpose is clearly a good one. Remember, you can always just turn it off.

      • Zombo@partizle.comOP · 1 year ago

        That’s kind of the risk with any technology. And I admit, it is the most likely way we lose control: someone will ask, “why does Apple let you turn off the child porn filter?” and the answers may not be enough for lawmakers or an angry mob.

        The same could be said of a great many tools that filter bad content, from spam filtering to DDoS mitigation. Should a technology be unavailable to consumers based on a hypothetical? That would be just as bad.

        If a technology exists to filter content I don’t want to see, who are you to tell me Apple shouldn’t sell me a device with that technology I want?

  • dragonfornicator@partizle.com · 1 year ago

    So more scanning of arbitrary data for sanctimonious reasons, and definitely not for the sake of collecting data. I'm curious what is sent where regarding those scans. There has already been a scandal involving Amazon and its Ring cameras. The software might run on the device, but whatever detection it uses is bound to make mistakes, and who sees the results? Is everything fully automated, or human-verified? I don't know which would concern me more. And that's not even getting into young people taking photos of their own bodies for various reasons. Just because it runs on your device does not necessarily mean that whatever is scanned is never sent anywhere; it just means that the scanning happens on your device.

    Quite frankly, if it weren't so horrible, I'd find the idea of some secret ring inside Apple using that CSAM detection to collect material to sell on the dark net rather interesting. It might make a good plot for a thriller or novel…

      • dragonfornicator@partizle.com · 1 year ago

        It's something that's not talked about, which, given our data-obsessed world, I interpret as "we just do it by default (because nobody will complain, it's normal, yada yada)".

        Besides, it's stated that the scanning itself only happens on your device. But if you scan locally for illegal material, it's not really far-fetched that someone gets informed when, for example, CSAM is found on a device. Why else would you scan for it? So at the very least, that information is collected somewhere.

        • bouncing@partizle.comM · 1 year ago

          I think your threat model for this is wrong.

          First of all, understand how it works: it's a local feature that uses image recognition to identify nudity. The idea is that if someone sends you a dick pic (or worse, CSAM), you don't have to view it to know what it is. It has been an option on minors' accounts for some time now, and it is legitimately a useful feature.

          Now they're adding it as an option on adult accounts and letting third-party developers add it to their apps.
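
          To make the "local feature" point concrete, here's a rough sketch of the decision flow being described: an on-device classifier scores an incoming image, and the messaging UI blurs it if the score crosses a threshold. Everything here (the classifier, the threshold, the function names) is a hypothetical stand-in for illustration, not Apple's actual model or API.

          ```python
          # Hypothetical sketch of an on-device "sensitive content" filter.
          # The classifier and threshold are stand-ins; the real model is
          # not public in this form. Nothing is sent off-device here: the
          # score is only used locally to decide whether to blur the image.

          from dataclasses import dataclass


          @dataclass
          class ScanResult:
              score: float   # model confidence that the image is sensitive
              flagged: bool  # whether the score crossed the threshold


          THRESHOLD = 0.8  # assumed cutoff, purely illustrative


          def scan_image(pixels: list[float]) -> ScanResult:
              """Stand-in for an on-device ML model.

              Fakes a sensitivity score from the mean pixel value,
              purely so the example runs end to end."""
              score = sum(pixels) / len(pixels) if pixels else 0.0
              return ScanResult(score=score, flagged=score >= THRESHOLD)


          def display_decision(pixels: list[float], filter_enabled: bool) -> str:
              """Decide what the messaging UI shows for an incoming image."""
              if not filter_enabled:
                  return "show"  # user turned the filter off
              result = scan_image(pixels)
              return "blur_with_warning" if result.flagged else "show"
          ```

          The point of the design is in that last function: the scan result never leaves `display_decision`, and the user-controlled `filter_enabled` flag short-circuits the whole thing, which is what "it's local and you can turn it off" means in practice.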

          The threat that they'll suddenly start sending the scanning results to corporate without telling anyone seems unlikely. It would be a huge liability and would have no real benefit for them.

          But the threat is this: with the technology available, there will be pressure to make it mandatory ("Why does Apple let you disable the child porn filter? wtf?"). If they bend to that pressure, why not introduce filters for other illegal content? Why not filter comments criticizing the CCP in China, or content that infringes copyright?

          Having a "dick pic filter" is useful technology, and I know some people who would love to have it. That doesn't mean the technology couldn't be misused for nefarious purposes.

          • dragonfornicator@partizle.com · 1 year ago

            I am aware that it's local; I just assumed it would also call home.

            My threat model here is based on cases like this: https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation

            And yes, I did see it as a privacy issue, not a censorship one. Inevitably, if there is pressure to expand this to other content, it could become a problem comparable to the "Article 13" debate Europe was, or is, facing.

            Generally, blocking specific types of content is a valid option to have, as long as it is an option and the user knows it is. I just distrust it coming from the likes of Google or Apple.

            • theonlykl@partizle.com · 1 year ago

              I would honestly find it very difficult to believe that there isn't going to be some telemetry, data, etc. sent back to the mothership. I know Apple's marketing caters to "privacy," but who's really validating those claims?

              Granted… I'm also very tin-foil-hatty about my data and retain it all locally with offsite backups. I tore down my Google Drive / cloud data about two years ago.