- cross-posted to:
- technology@lemmy.ml
“A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough.” Mom wary about answering calls for fear voice will be cloned for future virtual kidnapping.
Unless I know who you are, I’m not answering your call. 90% of the time it’s a robot and the other 10% can leave a voicemail.
Isn’t a voicemail worse for detecting deepfakes, since it doesn’t require the fake to listen and respond dynamically?
I’m not personally concerned about getting duped by a deep fake. I just don’t want to talk to any robots, solicitors, or scammers.
I’ve had calls from numbers similar to my own, and seen caller IDs for people who aren’t in my contacts. I haven’t picked them up, but the temptation was there to do so.
You might know the number. My wife used to live in Kenya and renamed her “Mum”/“Dad” contacts after her parents once got a call from her stolen phone saying she’d been arrested and they needed to send money for bail.
I’ll go one further - unless it’s my doc, my wife, or my boss, I’m neither answering the call nor listening to the voicemail. That’s what easily skimmable voicemail transcription is for…
I don’t love the privacy implications of transcribed voicemail, ofc, but it’s better for my own privacy/threat model than answering the phone to robots, scammers, etc. It’s also a hell of a lot better for my mental health than listening to them.
Same. Life is much better this way
Real kidnappers will not be happy about this as deepfakes become more prevalent and ransom calls get ignored more and more. Do they have a union that can go on strike to raise awareness about this?
The real victims here
“Improving” is not a word I would use in this case.
Like that old Invader Zim line.
“You made the fires worse!”
“Worse? Or better?”
Right? Reads like… hey kidnappers, parents hate this one simple trick
This is why code words are important.
As awful as it sounds, this needs to be set up between family members. Agree on a phrase or code word to check and make sure they are who they say they are. This is already common with alarm-system monitoring companies; they have to make sure the intruder isn’t the one answering the phone and claiming it’s a false alarm.
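For what it’s worth, the same pre-shared-secret idea can be sketched in software terms. This is only a minimal illustration of “compare against something agreed in advance,” not anything from the article or thread; the phrase, the normalization, and the `verify_code_word` helper are all hypothetical.

```python
import hashlib
import hmac

def normalize(phrase: str) -> str:
    # Lowercase and collapse whitespace so "Purple  Walrus " matches "purple walrus".
    return " ".join(phrase.lower().split())

# Store only a hash of the agreed phrase, not the phrase itself (hypothetical example value).
STORED_DIGEST = hashlib.sha256(normalize("purple walrus").encode()).digest()

def verify_code_word(claimed_phrase: str) -> bool:
    # Constant-time comparison of the hashes avoids leaking information via timing.
    candidate = hashlib.sha256(normalize(claimed_phrase).encode()).digest()
    return hmac.compare_digest(candidate, STORED_DIGEST)

if __name__ == "__main__":
    print(verify_code_word("Purple  Walrus"))  # True
    print(verify_code_word("pink walrus"))     # False
```

On a real phone call the “protocol” is just asking for the phrase, of course; the point of the sketch is that you verify against something agreed beforehand, not against how the caller sounds.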
Techbros be like “but what if it can be used to resurrect dead actors?”.
Not even necessarily dead actors. They used AI to bring young Luke Skywalker back in The Book of Boba Fett. And it was not great, but it was serviceable. Now give it 10 years.
Ohh, well that totally makes all this worthwhile then.
Sorry, I wasn’t trying to suggest that at all. I think it’s appalling.
You could make Julius Caesar sing Gangnam Style, if DNA-analysis AI and deepfake-voice AI both work towards this common noble goal.
The ‘hostage-taker’ will never be able to duplicate my family’s grammar and sentence structure quirks, so I won’t care how it “sounds”…
“we have Face/Off at home”
I’m pro-AI but any technology that can lead to the creation of deepfakes must be explicitly banned.
Naturally, we’re already talking about criminals, but you combat this issue the same way you combat school shootings: ban the root of the problem and actively prosecute anyone who dares to acquire it illegally.
EDIT: The victims wouldn’t have fallen for this deepfake scam if they had their own deepfake scam. Scam the criminal before he scams you!!!
Would it even be possible to ban? Every military in the world wants this technology.
Well, the military has tanks too, should we go around selling tanks to the general population?
You can’t download a tank.
One leak would make this deepfake software publicly available to all bad actors anyway, regardless of any ban.
Also a global ban seems unlikely. You can protect Americans from bullets coming from enemy territory. You can’t protect them from viral deepfaked posts.
You can’t download a tank.
AHAHAHHAHAHAHHAA
OK OK I give up. I can’t possibly fight against THIS LEVEL OF LOGIC.
You can’t protect them from viral deepfaked posts.
In other words “Buuuut Ruussssai will deeeepfakeee our politiiciiians we must be able to deepfake them too”.
Drop the cold war mentality please, for the sake of all of us.
Going through your comment history, it’s obvious you’re a China apologist. Seems like the trolls spared no effort to migrate to Lemmy as well.
Removed by mod
👏👏
I am not talking about America vs Russia. I am talking about the fact that Internet censorship doesn’t really work.
Chinese citizens can easily bypass Internet restrictions implemented by the world’s second-largest economy with nothing more than a simple VPN.
The only way for a country to implement an effective deepfake ban would be to disconnect from the global Internet entirely and let computers connect only to a government-controlled intranet, like in North Korea.
And even then, a criminal with a USB stick holding a copy of the illegal deepfake software could still easily do what this article is talking about.
In the US it is legal to own a tank.
Removed by mod
Is this what gaslighting is?
Perhaps there should be government-controlled licenses for some technologies, like for gun ownership? Although there are probably all sorts of ways that could be circumvented. Not sure how best to control this though.
*edit: wow, thanks, nice to know what sort of community this is. Nothing in the responses so far has told me anything I didn’t already know, and I did point out that it would be circumvented. I don’t see any other ideas though. Maybe with that sort of negative, defeatist response we’ll never even try. Fuck it, right, let’s just watch the world burn. /s
Ah yes because making something illegal stops criminals from using it. Problem solved.
Perhaps there should be government controlled licenses for some technologies
Basically impossible.
It’s against the ToS to use tools like TeamViewer to run user-support scams, for example, but people do it anyway. You can’t legislate against criminals, because if they’re already breaking the law, why would they care about another law?
The only way forward here is enforcement. There needs to be better coordination between governments to track down and prosecute those running the scams. There’s been a lot of pressure on India, for example, to clean up their act with their very lax cybercrime enforcement, but it’s very much an uphill battle.
Not really comparable to guns; making it harder to get a physical object is much different from preventing people from downloading software. Even 3D-printed guns require equipment and knowledge to make use of the download.