There’s a “Tom Cruise” account on TikTok that will freak you out. It looks like Tom, it sounds like Tom, but it’s not Tom.
It’s AI technology and it’s taking Hollywood by storm. You can’t tell who’s real and who’s fake anymore.
Back in the old days, the voice would give it away. Not anymore – thanks to AI voice technology, it’s getting harder and harder to tell fact from fiction.
[Embedded TikTok video from @millionaire_mansions: #deeptomcruise #milesfisher]
Another example of this incredible voice technology is this deep fake clip of Joe Rogan and Jordan Peterson talking about a dancing gorilla.
[Embedded TikTok video from @ze1tur, labeled “DISCLAIMER: THIS IS AI GENERATED” #joeroganpodcast #jordanpeterson #deepfake]
And there’s no denying that some of this technology is getting rather sophisticated and creative.
[Embedded TikTok video from @news.com.au: “What’s reality and what’s fake?” #deepfake #leonardodicaprio]
But what happens when the technology goes from harmless celebrity impersonators to something a lot more serious?
Well, we’re actually finding that out right now, thanks to a disturbing and very dangerous new telephone scam.
One popular phone scam in years past involved a call from someone claiming to be a “police officer” demanding money to keep one of your family members out of trouble.
Many people understandably fell for that, but things have gotten a lot worse thanks to AI voice technology.
Federal regulators are now warning that you could receive a “panicked” call from someone who sounds exactly like a relative or family member frantically begging for money.
Wanting to help a trusted loved one in need, you quickly wire the money – only to discover that familiar voice on the other end was nothing more than an AI deep fake.
How do these scammers do it? They simply clone a voice.
The Federal Trade Commission issued a consumer alert this week urging people to be vigilant for calls using voice clones generated by artificial intelligence, one of the latest techniques used by criminals hoping to swindle people out of money.
The only thing scammers need to clone a voice is a short audio clip of their target’s voice… which is something that’s easily obtained online with a few clicks.
Once they have the clip, they run it through a voice-cloning program, and when the scammer calls you, he or she will sound just like your loved one.
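To give a sense of just how low the barrier has become, here’s a minimal sketch of zero-shot voice cloning using the open-source Coqui TTS library – one of several freely available tools that can do this. The model name is real; the file paths and the spoken text are hypothetical placeholders, not taken from any actual scam:

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library.
# Illustrative only: the file paths and text below are hypothetical placeholders.
# Install with: pip install TTS

from TTS.api import TTS

# Load XTTS v2, a multilingual model that supports zero-shot voice cloning
# from just a few seconds of reference audio of the target speaker.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "reference_clip.wav" stands in for a short voice sample, the kind of
# audio that can be scraped from a target's social media posts.
tts.tts_to_file(
    text="This is a demonstration of synthetic speech in a cloned voice.",
    speaker_wav="reference_clip.wav",  # hypothetical path to the scraped sample
    language="en",
    file_path="cloned_voice.wav",      # output audio in the cloned voice
)
```

That’s the whole pipeline: a few seconds of scraped audio in, a convincing synthetic voice out. No special hardware, no expertise required.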
And these scammers are using this technology not only to terrify average people but also to steal massive amounts of money.
In 2019, scammers impersonating the boss of a U.K.-based energy firm demanded $243,000 from its CEO. A bank manager in Hong Kong was fooled by someone using voice-cloning technology into making hefty transfers in early 2020. And at least eight senior citizens in Canada lost a combined $200,000 earlier this year in an apparent voice-cloning scam.
So, what should you do if you receive a panicked call from a loved one begging for a wire transfer, gift cards or some other sketchy money transfer?
Well, FTC officials suggest you immediately hang up and call the person directly in order to verify the story.
Scary stuff. This is what happens when technology falls into the wrong hands.
Be careful out there. Not everything is as it seems…