Hear a loved one’s cry for help? It could be a scam

Imagine receiving a call from a loved one who is pleading for your help. You would likely be motivated to do whatever it takes to get that person out of danger.

But thanks to artificial intelligence, that call may not be what it seems.

The Washington Post reports that rapidly developing technology is making it possible for fraudsters to mimic voices more convincingly. Crooks use the technology to trick people – often seniors – into believing that loved ones are in harm’s way.

According to the Federal Trade Commission, these so-called “impostor scams” were the second most reported fraud category in 2022, with losses totaling $2.6 billion.

The Post reports that fraudsters are growing bolder in their schemes:

“Advances in artificial intelligence have added a terrifying new layer, allowing bad actors to replicate a voice with audio samples of just a few sentences. Powered by AI, a host of cheap online tools can convert an audio file into a replica of a voice, allowing an impostor to ‘speak’ whatever they type.”

As the Post reports, the software used in these scams analyzes multiple aspects of a person’s voice — including pitch and timbre — to create a convincing audio forgery. Fraudsters can find the audio samples they need to create their scams on social media sites like YouTube, in podcasts, and in videos people post on TikTok, Instagram, or Facebook.

The best way to avoid falling prey to this type of scam is to stay vigilant. According to the Post’s story, some of the things you can do are:

Put the call on hold if it sounds like you’re talking to a family member or friend, then call that person separately to verify. Don’t provide assistance in the form of gift cards, which can be difficult to trace. Refuse to send cash.

