All about fake profiles: How to recognize them

You can buy the profile of a fake person for just a few euros and adjust the appearance to whatever you need: age, gender, even ethnicity are up to you. Thousands of simulated profiles are circulating on the internet, and behind them are all kinds of people who wish to remain anonymous. But if you pay close attention, you can still spot a fake profile.

The Facebook group Convoy to Ottawa 2022 is one of the social media initiatives that early this year called for the Canadian capital to be occupied with trucks in protest against the coronavirus measures. One of the administrators of this Facebook group is Jason Shoeaddict. Probably a pseudonym, which is strange for someone who claims to work for National Geographic TV, a claim of which no trace can be found.

Even stranger is the fact that Shoeaddict's profile picture also shows up on another profile, that of Pavel Zakirov, someone who poses as the recruitment manager of a Russian website for freelancers. We have no idea whether Shoeaddict is a Russian troll inciting Canadian activists. What we do know by now is that the photos of Jason and Pavel are the product of deepfake technology. The man may look like your neighbor, but no one in the world is walking around with the face you see in these profile pictures.

Left Pavel Zakirov, right Jason Shoeaddict.

GAN

Fake profiles have long ceased to be an exception on social media, dating sites and recruitment platforms. They are used to give the impression that an organization has a large following. Fake profiles not only use fake names, they also like to add a face for the sake of credibility. You could of course steal a photo of an existing person, but these days you would quickly get caught. That is why people prefer to use a deepfake photo.

These photos are composed by a GAN (Generative Adversarial Network). Such a generative adversarial network is a class of machine learning algorithms in which two neural networks constantly compete with each other: one network acts as the generator, the other as the discriminator.

The generator tries to produce images that mimic the real training images as closely as possible, while the discriminator learns to distinguish the forged images from the originals. StyleGAN is such a generative adversarial network, made available as a resource by Nvidia researchers three years ago.
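
For readers who want to see that adversarial idea in code, here is a minimal, hypothetical sketch in Python with PyTorch: a tiny generator and discriminator trained against each other on stand-in data. It only illustrates the principle; StyleGAN itself is far larger and is trained on real portrait photos.

```python
# Minimal sketch of the GAN idea: a generator and a discriminator trained
# against each other. Toy example on tiny flattened "images", not faces.
import torch
import torch.nn as nn

IMG = 8 * 8   # tiny flattened "image" for illustration
LATENT = 16   # size of the random input to the generator

generator = nn.Sequential(
    nn.Linear(LATENT, 64), nn.ReLU(),
    nn.Linear(64, IMG), nn.Tanh(),           # fake image with values in [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(IMG, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1),                        # real/fake score (logit)
)

loss_fn = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# Stand-in for a folder of real portraits: smooth random patterns.
real_batch = torch.tanh(torch.randn(32, IMG))

for step in range(200):
    # 1) Discriminator: learn to tell real images from generated ones.
    fake_batch = generator(torch.randn(32, LATENT)).detach()
    d_loss = loss_fn(discriminator(real_batch), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake_batch), torch.zeros(32, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # 2) Generator: produce images the discriminator labels as real.
    g_loss = loss_fn(discriminator(generator(torch.randn(32, LATENT))),
                     torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```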

All these photos were generated by a GAN.

One person per click

If you feed a huge number of images into such a network, the system can create new images of a similar type. GANs are now also used to generate art on their own, such as panoramas of gigantic cities, idyllic sunsets or Escher-like structures, and even to produce music. In time, GANs will be able to make images of people speak and move in a natural way.

Feed a GAN a huge number of portraits and the tool starts composing new faces on its own, with person A's nose, B's lips, C's eyebrows and so on. On the website This Person Does Not Exist by Phillip Wang, new faces of non-existent persons are constantly being generated. You can download any photo. By the way, cat lovers can enjoy a parody of this site: www.thiscatdoesnotexist.com.
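
If you want to collect a few of these generated faces for practice, a small script will do. The sketch below assumes that thispersondoesnotexist.com still serves a fresh JPEG on every plain request to its root URL; the file names are just examples.

```python
# Save a few generated faces from This Person Does Not Exist.
import time
import requests

URL = "https://thispersondoesnotexist.com"
HEADERS = {"User-Agent": "Mozilla/5.0"}  # some servers refuse empty user agents

for i in range(3):
    response = requests.get(URL, headers=HEADERS, timeout=30)
    response.raise_for_status()
    with open(f"fake_face_{i}.jpg", "wb") as f:
        f.write(response.content)
    time.sleep(2)  # be polite; each request should return a new face
```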

With every click of the mouse you create a new person.

Communication weapon

What once started as an innovative concept is now being used as a communication weapon. Shady things happen with fake profiles. For example, according to William Evanina, director of the US National Counterintelligence and Security Center, fake profiles on LinkedIn are a risk-free way to recruit people for espionage. “Instead of sending spies to a parking garage in the US to recruit a target, it’s more efficient to sit in front of a computer in Shanghai and send friend requests.”

Fake profiles are used to influence public opinion. As early as 2016, Russian trolls tried to disrupt the political debate in the US elections. Since then, Twitter has been actively tracking down fake accounts to suspend them. This effort reduced the number of users of this platform by nine million.

The best-known example in the Low Countries is Eduard van Vlugtenstahl, who is currently still doing the rounds on Twitter as Mr. Panini. It was he who tweeted at the height of the Covid crisis: “Had a nasty post today. My ex-father-in-law died suddenly after a second shot. Please don’t ask about this. Draw your own conclusions.”

The message was immediately retweeted by a Dutch politician. After a while it became clear that Van Vlugtenstahl does not exist at all and that his profile picture was generated by a GAN.

Fake profiles are used for deception.

Learn to detect

Actually, producing photos of non-existent people is a by-product. Originally, the purpose of the AI face generator was to learn to recognize fake faces, and faces in general. It is frightening how realistic the results look.

Although deepfakes are getting better, a neural network still makes mistakes. You can discover them by looking very closely, because human visual processing is still stronger than the computer’s. You can recognize a fake by a hair that sprouts from the forehead in a strange way, by the edge of a spectacle frame that suddenly disappears, by artificial blotches in the background, by repeated incisors…

These visual errors are called glitches. You can train yourself to spot them on Which Face Is Real, a project by two scientists at the University of Washington. Each time you are shown a photo from This Person Does Not Exist next to a real photo. It is up to you to decide which photo shows someone of flesh and blood.

You will see that it does not take long before you score 9 out of 10 at recognizing fake photos. You can also sharpen your skills as a deepfake detective in a quiz on Deepfact.3duniverse.

This one was easy, the hat betrayed the deepfake.

Watch the pupils

The people in the fake photos don’t look perfect; they are not movie stars. They could be colleagues. Yet the eyes often give away that it is a generated photo. Regardless of the position of the head, the pupils are almost always in the same place. If you mark the eyes with guide lines in an image editing program such as Photoshop (see the sketch below), you notice how unnaturally consistent the pupil positions are.
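
You can automate that guide-line trick with a few lines of Python and the Pillow library. The coordinates below are assumptions for 1024 × 1024 portraits such as those from This Person Does Not Exist; shift them to wherever the pupils sit in your own samples.

```python
# Draw guide lines at fixed pixel positions over several portraits and compare.
from PIL import Image, ImageDraw

EYE_ROW = 475            # assumed vertical position of both pupils
EYE_COLS = (385, 635)    # assumed horizontal positions (left eye, right eye)

for name in ["fake_face_0.jpg", "fake_face_1.jpg", "fake_face_2.jpg"]:
    img = Image.open(name).convert("RGB")
    draw = ImageDraw.Draw(img)
    # Horizontal guide through the pupils, vertical guides through each eye.
    draw.line([(0, EYE_ROW), (img.width, EYE_ROW)], fill="red", width=3)
    for col in EYE_COLS:
        draw.line([(col, 0), (col, img.height)], fill="red", width=3)
    img.save(name.replace(".jpg", "_guides.jpg"))
    # In GAN-generated portraits the pupils line up with the same guides in
    # every image; real snapshots rarely do.
```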

There are also regular symmetry errors in the ears, or someone has one shoulder smaller than the other, and you may see half a head or a piece of someone else’s arm in the background. There is also a verification site, www.sensity.ai. It used to be free, but now it only works via a payment model. This service can indicate with a degree of certainty whether a photo is fake.

Three photos from This Person Does Not Exist; the pupils are in the same place each time.

Generate yourself

A fake profile is not always malicious. Sometimes it concerns people who want to get involved in political or social discussions, without anyone being able to hold them accountable. Yet it remains a questionable form of anonymity, because these people do not admit that they are anonymous.

For certain projects and presentations, designers need photos of people that are free, diverse and not encumbered by copyright. That is why Icons8 has compiled a photo bank on Generated Photos containing 2,681,746 people, all of them generated by AI. You are allowed to download those photos free of copyright. Recently they also launched a new service, Generated Humans, with AI-generated pictures of full-body people who do not exist: a collection of 100,000 AI-generated models that can likewise be used without copyright.

If you choose Browse photos, you can filter the first thirty photos in the grid by background, gender, age, ethnicity, emotion and hair length. The online tool immediately applies your choices to all photos as you move the sliders.

If you want to download a JPG of 1024 × 1024 pixels, it will cost you $2.99 per photo, or you pay $19.99 per month for 15 photos. As mentioned, you can also download a photo for free, but then you may not use it for commercial purposes and the resolution is limited to 512 × 512 pixels.

The settings are applied to all portrait photos.

Face generator

Instead of browsing through already generated photos, you can also create faces yourself in the face generator. You then generate a single portrait based on the settings in the left-hand bar and can edit this person’s attributes in detail. If you want to turn the head a little to the left or upwards, or have the person put on sunglasses or reading glasses, that is possible. Again, you determine the age, the emotion, the skin color and the hair color; it is even possible to apply make-up to the eyes and lips. When you are done, you place the photo in the shopping cart.

You can also have one image generated yourself.

Compare eye reflection

Not only are the deepfakes getting better, the tools to unmask them are also becoming more effective. A promising project is currently being developed by computer scientists at the University at Buffalo.

It is a new AI tool that already recognizes GAN photos with 94% accuracy. The system exposes fakes by analyzing the reflections on the corneas of both eyes and comparing left with right. In a photo of a real person, the reflections in both eyes will be very similar because both eyes see the same scene. Deepfake images synthesized by GANs fail to reproduce exactly the same reflections; they often show inconsistencies, such as different geometric shapes or reflection locations that don’t match, as the simplified sketch below illustrates.
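
As a rough illustration of that idea, and explicitly not the Buffalo researchers’ own method, the sketch below compares the bright corneal highlights in two hand-picked eye regions with OpenCV. The eye coordinates are hypothetical and would need to be adjusted per photo or found with a face-landmark detector.

```python
# Compare the specular highlights in the left and right cornea of one portrait.
import cv2
import numpy as np

img = cv2.imread("fake_face_0.jpg", cv2.IMREAD_GRAYSCALE)

# (x, y, width, height) boxes around the left and right eye: assumed values.
LEFT_EYE = (350, 440, 80, 60)
RIGHT_EYE = (600, 440, 80, 60)

def highlight_mask(box):
    x, y, w, h = box
    eye = img[y:y + h, x:x + w]
    # Keep only the brightest pixels: the specular reflection on the cornea.
    _, mask = cv2.threshold(eye, 200, 255, cv2.THRESH_BINARY)
    return mask > 0

left, right = highlight_mask(LEFT_EYE), highlight_mask(RIGHT_EYE)

# Overlap (IoU) of the two reflection patterns: close to 1.0 when the
# reflections are consistent, noticeably lower when a GAN invented each
# eye separately.
intersection = np.logical_and(left, right).sum()
union = np.logical_or(left, right).sum()
print("reflection similarity:", intersection / max(union, 1))
```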

The study has now been completed, but the app is not yet available to the public. The system proves particularly effective with images from the This Person Does Not Exist website, as these are always portrait photos showing both eyes. If one eye is not visible, the method does not work.

A GAN photo cannot (yet) generate natural eye reflection.

Reverse search

Going back to our very first example: Jason Shoeaddict. How did fact-checkers find out that the profile picture of a key figure in the Facebook group is identical to that of the recruiter of a Russian freelance agency? It started with healthy suspicion. A reverse photo search revealed the double use of the profile photo.

One of the most popular tools for such a reverse image search is TinEye, a service that also has an extension for the browsers Chrome, Firefox, Edge and Opera. You drag the suspicious photo into the search box or paste its web address there.

Once you have installed the extension, you can right-click any image and choose the command Search Image on TinEye. With the TinEye Compare function you can quickly switch back and forth between the search image and a result for an even better comparison.
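
If you check profile photos regularly, you can also open such a reverse search from a script. The sketch below assumes that TinEye still accepts the image address via the url parameter of its search page; the photo URL is of course a placeholder.

```python
# Open a TinEye reverse image search for a suspicious profile photo.
import webbrowser
from urllib.parse import urlencode

suspect_photo = "https://example.com/profile-picture.jpg"  # placeholder URL
search_url = "https://tineye.com/search?" + urlencode({"url": suspect_photo})
webbrowser.open(search_url)  # shows the results in your default browser
```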

TinEye finds out where else that photo can be seen on the web.

MatchEngine

TinEye also offers MatchEngine which, given its payment model, is mainly intended for companies. With a subscription you get your own web tool with which TinEye finds duplicate and near-duplicate photos. It also includes specialist features, such as fraud detection that finds damage photos submitted multiple times for insurance claims.

MatchEngine supports visual inspection and image verification for dating and social media platforms. TinEye shows the most modified, oldest and newest versions of a photo. This allows you to see when the photo was first posted and when it was further used and modified.

TinEye’s MatchEngine also finds ‘modified’ photos.
