These are “GAN faces”: fake faces created with artificial intelligence and used to commit fraud

The GAN face (left) is generated by artificial intelligence. (ESET)

Cybercriminals have begun to use artificial intelligence capabilities, particularly those that can generate audio, video, and images, to improve their strategies and carry out hard-to-detect cyberattacks.

One new scam is powered by “GAN faces.” Generative adversarial networks (GANs) are artificial intelligence programs that can create digital content that looks close to reality but does not actually exist.

The face images generated by these programs do not correspond to any real person; they are entirely fictional. However, their resemblance to real photographs makes them difficult to identify and allows criminals to use them in a variety of scams.
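The adversarial idea behind GANs can be illustrated with a toy example: a generator learns to produce samples that a discriminator cannot tell apart from real data. The sketch below is a deliberately minimal one-dimensional version using only NumPy (real face-generating GANs use deep networks; the variable names and setup here are illustrative assumptions, not any specific system):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "real data": numbers drawn from a normal distribution around 4.
real_mean = 4.0

# Generator: a linear map from noise z to a sample (stands in for a deep net).
g_w, g_b = rng.normal(), 0.0
# Discriminator: logistic regression scoring samples as real (1) or fake (0).
d_w, d_b = rng.normal(), 0.0

lr = 0.05
for step in range(2000):
    z = rng.normal()
    fake = g_w * z + g_b
    real = rng.normal(real_mean, 1.0)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        grad = p - label          # gradient of binary cross-entropy w.r.t. logit
        d_w -= lr * grad * x
        d_b -= lr * grad

    # Generator update: adjust G so that D(G(z)) moves toward 1 (fool D).
    fake = g_w * z + g_b
    p = sigmoid(d_w * fake + d_b)
    grad_fake = (p - 1.0) * d_w   # backpropagate through the discriminator
    g_w -= lr * grad_fake * z
    g_b -= lr * grad_fake

# After training, generator samples should cluster near the real data.
samples = g_w * rng.normal(size=1000) + g_b
```

The same two-player dynamic, scaled up to convolutional networks trained on millions of photos, is what lets GANs produce faces that pass for real at a glance.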


According to the cybersecurity company ESET, there is a 50% chance of mistaking a GAN-generated fake face for a real one, which makes this strategy ideal for creating fake profiles on social networks. The primary goal of these scams is to obtain confidential information in order to steal money, most commonly from savings accounts.

Cybercriminals can produce realistic human faces using artificial intelligence. (Ibero Leon)

To do this, criminals can rely on various methods, such as romance scams (which end with conversations asking for “financial help”) and phishing (impersonating a real person to send fake customer service messages), among others.

Another danger of this type of technology is the spread of false information. AI-generated faces have already been used to fake videos or audio recordings of national authorities supposedly making inappropriate or critical statements. If such content spreads widely, it can jeopardize good relations between countries.


Identity theft is also possible with this technology, since it can reproduce images much like those of a famous artist or public figure. This method may even be able to bypass some facial recognition systems used to authenticate users and unlock devices. Although some companies are working to prevent this from happening, there are currently no features available to stop such cases.


To avoid becoming a victim of scams that use these images, and to detect them as soon as possible, users should keep the following recommendations in mind:

“Love bombing” is a method used by cybercriminals that consists of expressing excessive love or affection in a short period of time to obtain “financial benefits” and steal money from victims. (National Criminal Lawyers)

– Check the source of the image to make sure its origin is genuine and that no editing software was used. You can save the photo and then run a reverse image search on Google to see whether it was pulled from an existing site. The same can be done with a GIF or video received by any means.
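Reverse image search works by comparing images on perceptual similarity rather than exact bytes. A minimal sketch of one such technique, average hashing, is shown below using NumPy on synthetic grayscale arrays (a simplified illustration of the idea, not Google's actual algorithm):

```python
import numpy as np

def average_hash(img: np.ndarray, size: int = 8) -> np.ndarray:
    """Downscale a grayscale image by block-averaging, then threshold at the mean,
    producing a compact binary fingerprint that survives small edits."""
    h, w = img.shape
    img = img[: h - h % size, : w - w % size]        # crop to a multiple of size
    bh, bw = img.shape[0] // size, img.shape[1] // size
    small = img.reshape(size, bh, size, bw).mean(axis=(1, 3))  # block averages
    return (small > small.mean()).astype(np.uint8)   # 1 = brighter than average

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two hashes; lower = more similar."""
    return int((a != b).sum())

rng = np.random.default_rng(1)
original = rng.random((64, 64))
# A lightly edited copy (small noise) should keep nearly the same hash.
edited = np.clip(original + rng.normal(0, 0.02, original.shape), 0, 1)
# An unrelated image should differ in roughly half its hash bits.
unrelated = rng.random((64, 64))

d_edited = hamming(average_hash(original), average_hash(edited))
d_unrelated = hamming(average_hash(original), average_hash(unrelated))
```

Because the fingerprint changes little under resizing or light edits, a search engine can match an uploaded photo against previously indexed copies, which is exactly why running a suspicious profile picture through a reverse image search can reveal whether it was lifted from an existing site.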


– Keep the security systems of desktop computers, laptops, cell phones, and any other electronic devices updated. This will allow the installed antivirus to actively detect and eliminate threats.

– Do not share confidential information. This is the most practical advice for avoiding a cyberattack that relies on this type of practice. Even if an account seems authentic and trustworthy enough to share banking information with, that does not mean it is, or that it has no ulterior motives once it obtains the data it needs.
