Criminals use artificial intelligence to imitate voices and defraud people

Artificial intelligence has held a controversial status over the last year. It has been at the center of debates about the ethics of its use, especially in jobs and professional environments. However, that was barely the tip of the iceberg, since concerns about the criminal use of AI have been growing stronger.

Unfortunately, scammers have been using phone calls and messages to deceive people and ultimately defraud them. Typically, they pretend to be a close friend or relative and ask for money over the phone, claiming they need it to get out of a complicated situation or to solve a problem.

As a result, the chair of the Federal Trade Commission (FTC), Lina Khan, warned that it is very important to pay attention to the development of artificial intelligence, as it could lead to an increase in fraud cases. Khan stated that criminals might use recordings of people on social media to “clone” their voices through AI programs. Unfortunately, accessing text-to-speech applications is relatively easy nowadays.

In an attempt to help users combat this issue, the FTC published an article with useful advice on how to avoid falling victim to fraudulent phone calls involving voice cloning or voice imitation.

First, check the phone number and contact the person who supposedly needs help to verify their identity. If they do not answer, you can always reach out to another relative or friend to ask about their whereabouts. The article also noted that criminals usually ask for money through methods that make it hard to recover, such as wire transfers, cryptocurrency, or gift cards.

Carlos Gaviria

Writer at Drop The Info since 2023. Graduated with a degree in English/Spanish from the Universidad Pontificia Bolivariana.
