
“Mom! They've got him” – but it was AI: The Real Horror Movie That Played Out in Kansas

A terrifying call. A frantic 911 report. Police race to stop what they believe is a kidnapping – only to learn it was all a hoax.

Such was the case recently in Lawrence, Kan., where a woman picked up a voicemail that sounded like her mother in serious trouble.

The voice had been generated by AI – the whole thing was a lie. And suddenly it wasn't the plot of a crime novel; it was real life.

The voice on the other end “sounded just like her mother,” police said, matching her tone, cadence, and even emotional state.

Everything suggests the scammers captured a sample of her voice (perhaps from social media or a voicemail greeting), fed it into a voice-cloning AI, and let it do the rest.

So the woman called 911. Police traced the number and pulled over a car – only to find no kidnapping at all, just a virtual threat engineered to trick the senses.

This is not the first time something like this has happened. With just a snippet of audio, today's AI can reproduce the dulcet tones of a Walter Cronkite or a Barack Obama – saying things the real person never said – making deepfake manipulation more convincing than ever.

One recent report from a security firm found that people had trouble distinguishing a cloned voice from the real thing roughly 70 percent of the time.

And this is not just about pranks and small-time scams. Fraudsters have used these tools to impersonate government officials, dupe victims out of large sums, and pose as friends and family members in emotionally charged situations.

The upshot: a new type of deception that is harder to spot – and easier to pull off – than any in recent memory.

Unfortunately, trust easily becomes a weapon. When your ear – and your emotional response – believes what it hears, even the most basic skepticism can evaporate. Victims often don't realize the call is a scam until it's too late.

So what can you do when you get a call that feels “too real”? Experts suggest small but important safety nets: agree on a family safe word ahead of time; verify by calling your loved one back on a number you know, not the one that called you; or ask a question only the real person could answer.

OK, so it's an old-school phone check, but in the age of AI that can reproduce tone, laughter, and sadness, it could be just the ticket to keep you safe.

The Lawrence case is, above all, a wake-up call. As AI learns to mimic our voices, fraud is getting much, much worse.

It's not just fake emails and phishing links anymore – now it's your mother's voice on the phone, and every atom of your being wants to believe that something terrible hasn't happened.

That is chilling. And it means we all have to stay vigilant – with skepticism, verification, and a healthy dose of disbelief.
