Artificial Intelligence Voice Phone Call Scams


by Mark Schwendau

Phone scammers are now using artificial intelligence (AI) software to clone voices, a technique called voice synthesis or voice cloning. The population most commonly targeted by this phone scam is the less informed and more gullible elderly. As a member of that population, I was targeted by scammers using this ploy last summer.

I received a phone call one morning from a person who sounded like a former graduate student of mine. I will call this student “John” for purposes of this article. The call went like this:

SCAMMER: Mark? Do you know who this is?

(The voice sounded exactly like the voice of my former student, John.)

ME: John?

SCAMMER: Yeah, this is. Hey, I got myself in a bit of trouble last night. I was at a wedding reception up here in Minnesota and got arrested for a DUI, and now I need bail money.

ME: Did you call your wife or father?

SCAMMER: No, they would be really mad at me. Listen, I only got 3 minutes to talk.

ME: What are you doing up in Minnesota?

SCAMMER: (Getting angry.) I told you, attending a wedding and reception!

ME: Which one of your cars were you driving when you got the DUI up there?

SCAMMER: Click

Here is the backstory you need to know about how I conducted myself:

First, the scammer asked if I knew who was calling to see whether the voice impression was good enough to suck me into the scam.

Second, I had never had a student call me asking for financial help over the phone, so this was immediately suspicious. Anything involving money should be an immediate red flag.

Third, I knew the claim of having only 3 minutes to talk was a red flag, as I had worked as a vendor in the prison industry. He should have had at least 15 minutes.

Fourth, my graduate, John, is one of the few former students who calls me regularly with updates about his work life (I was his vocational-technical professor) and his family. I have met his family.

Fifth, I knew John well as a student. He is an immigrant from Eastern Europe who is blind in one eye and scarred from a prior accident in which he was hit by a drunk driver. I knew he did not drink, both for this reason and for religious reasons.

Sixth, I knew John drove an expensive sports car because he came from a family with money. I used this knowledge to bust the scammer’s ploy instantly.

What was most concerning to me is that the person who perpetrated this hoax had to have personal knowledge of John and me, as well as of the friendship that developed between us after his graduation.

Here’s an overview of what it took to pull this scam off:

Step 1. Acquire AI voice software on the Internet and load it onto your computer.

See, for example, “The Top 10 Best AI Voice Generators 2024” by Dr. Alex Young.

Step 2. Collect audio recordings of the subject’s voice to create a believable voice clone for the scam target. These recordings can be obtained from many public sources, such as the audio of videos the subject posted to social media, or voice recordings captured during phishing phone calls with the subject.

Step 3. Create a voice synthesis model from the collected audio recordings. These models learn the patterns, intonations, and speech characteristics of the subject’s voice.

Step 4. After the model is created, scammers can input any text they want the cloned voice to say. The newly created AI model then generates a synthetic voice that mimics the subject’s voice based on the patterns learned while creating the model.

Step 5. This initial voice clone is then fine-tuned by providing additional audio samples, a process known as transfer learning. This improves the accuracy and naturalness of the cloned voice. The idea is to eliminate any breaks in the conversation or speech pattern.
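Transfer learning is a general machine-learning technique, not something unique to voice cloning. The toy sketch below (hypothetical synthetic data, not a real voice model) shows the essence of the fine-tuning step: a “pretrained” base is frozen while only a small head is trained on the new samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained" base: a fixed random projection standing in for layers
# that already capture generic speech patterns. (Hypothetical stand-in --
# a real voice model would be a deep neural network.)
W_base = rng.normal(size=(8, 4))          # frozen during fine-tuning

# Synthetic "speaker data": inputs X and target outputs y for the new voice.
X = rng.normal(size=(64, 8))
true_head = rng.normal(size=(4, 1))
y = np.tanh(X @ W_base) @ true_head

# Trainable "head": the small part adapted to the new speaker's samples.
W_head = np.zeros((4, 1))

def loss(W_head):
    pred = np.tanh(X @ W_base) @ W_head
    return float(np.mean((pred - y) ** 2))

initial = loss(W_head)
lr = 0.1
for _ in range(200):
    h = np.tanh(X @ W_base)               # features from the frozen base
    grad = 2 * h.T @ (h @ W_head - y) / len(X)
    W_head -= lr * grad                   # update only the head

final = loss(W_head)
print(initial, final)                     # fine-tuned loss is far lower
```

Because only the small head is trained, relatively few samples of the new speaker are needed, which is exactly why a scammer can get away with a handful of social media clips.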

Step 6. Scammers then use the cloned voice on a target for a variety of malicious purposes in deceptive phone calls. In general, the intent is to gather personal information such as retirement account, medical insurance, and banking details. Most commonly, though, it is to get money in the form of a cash transfer, gift card purchase, or money order.

Voice cloning technology is still evolving, and while it may have some legitimate applications, it also raises concerns about abuse by cybercriminals and phone phishing scammers. Efforts are underway to develop countermeasures to detect and prevent voice cloning attacks.
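One low-tech countermeasure needs no audio analysis at all: checking the content of the call against known scam red flags. The sketch below is a rough checklist, not a real detector; the keyword lists are hypothetical, drawn from the urgency, money, and secrecy cues in the call described earlier.

```python
# Hypothetical keyword lists for three red-flag categories.
RED_FLAGS = {
    "urgency": ["only got", "minutes to talk", "right now", "hurry"],
    "money":   ["bail", "gift card", "wire", "money order", "transfer"],
    "secrecy": ["don't tell", "would be mad", "keep this between"],
}

def scan_call(transcript: str) -> list[str]:
    """Return which red-flag categories appear in a call transcript."""
    text = transcript.lower()
    return [flag for flag, phrases in RED_FLAGS.items()
            if any(p in text for p in phrases)]

call = ("Hey, I got arrested and need bail money. "
        "Don't call my wife, she would be mad at me. "
        "I only got 3 minutes to talk.")
print(scan_call(call))   # ['urgency', 'money', 'secrecy']
```

All three categories fire on the scammer’s script from my call; an ordinary catch-up call trips none of them.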

Thankfully, some of the good people of the world are taking to the Internet and social media platforms to educate the unknowing public, as in this Facebook short video reel:

Conclusion:
Here are FOUR rules I practice to blow the cover of most such scams:
    1. NEVER do money transactions over the Internet or phone unless you are the one who initiates the transaction to a verifiable phone number or Internet URL.
    2. NEVER provide any personal information to anybody you communicate with on the phone or Internet. Assume they are a stranger who remains unvetted and unverified.
    3. ALWAYS have a secret passcode or verification phrase to validate who you are talking to, and use more than one so you can pick at random. For example:

“What kind of car do I drive?”

“What is the name of our family pet?”

“What is the name of my spouse?”

“Did I ever have Covid?”

“Was I hospitalized?”

“Where do I/you work?”

    4. NEVER allow people you do not know to follow or friend you on social media platforms. They could be phishers out to collect personal information for nefarious activities such as voice cloning.
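Rule 3 can be sketched in code. The questions and answers below are hypothetical placeholders; the point is that a random challenge drawn from a shared pool is hard for a scammer to prepare for.

```python
import random

# Hypothetical shared pool of challenge questions and agreed answers.
CHALLENGES = {
    "What kind of car do I drive?": "a red pickup",
    "What is the name of our family pet?": "biscuit",
    "Where do you work?": "the hardware store",
}

def pick_challenge(rng: random.Random) -> str:
    """Choose one verification question at random from the pool."""
    return rng.choice(list(CHALLENGES))

def verify(question: str, answer: str) -> bool:
    """True only if the caller gives the agreed-upon answer."""
    return CHALLENGES.get(question, "").lower() == answer.strip().lower()

q = pick_challenge(random.Random(42))
print(verify(q, CHALLENGES[q]))        # True: the real person knows
print(verify(q, "uh, I forget"))       # False: the scammer stalls
```

A cloned voice only reproduces how someone sounds, not what they know, which is why a content challenge defeats it.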

There are countless jokes and videos about “stupid criminals.” Generally, these scammers are easily exposed so long as you don’t let them push you into a “fight or flight” response, the primal reaction we have when suddenly under duress. Stay calm, stay suspicious, and most of all, have some fun with them!

“Which one of your cars were you driving when you got the DUI up there?”

LOL

Copyright © 2023 by Mark S. Schwendau

~~~

Mark Schwendau is a retired college technology educator and author from Illinois. He holds a BS degree in technology education and an MS degree in industrial management. He has had news articles published in online news journals such as Communities Digital News and Independent Sentinel. His opinions are his own, as assured by the First Amendment of the Constitution.

