AI deepfakes: How to be a discerning viewer

Expert explains how to tell the real from the fake

IZWAN ROZLIN
16 Jul 2024 10:11am
Photo for illustration purpose only. - Photo illustrated via Canva

SHAH ALAM - Although AI deepfakes mimic real sources, the techniques behind them still have weaknesses, according to Selangor Technical Skills Development Centre Chief Operating Officer Dr Muhiddin Arifin.

He said fraud carried out using someone's voice and face can still be detected.

"This situation is still under control, and it's difficult for fraudsters to imitate 100 per cent of the original character because there will always be differences.

"AI requires time to improve its weaknesses and still struggles to overcome the authenticity of the original character," he said.

Muhiddin explained that while detecting deepfakes can be complicated, viewers can become more discerning by paying attention to details and applying several techniques.

"One way to identify whether media content is a deepfake is by looking for unnatural movements.

"Look for mismatches in facial expressions, blinking patterns, or body movements. Does the person's mouth movement seem out of sync with the audio and not align with the muscle movements around the person's lips?"

He added that deepfakes might struggle to replicate skin tones and hair textures perfectly, leading to blurring or odd colours in the video.

"Look for unusual blurriness around the face, neck, or hairline. Deepfakes might also be inconsistent in lighting or background details.

"Do light sources appear to change in the scene, or do background elements seem out of place?" he added.

Muhiddin also stressed the importance of paying attention to the quality and delivery of the audio.

"Does the person speak like themselves, or are there jerky or unnatural speech patterns that could raise suspicions?"

"In short, deepfakes are media manipulated through video or images created using AI platforms.

"They can make someone appear to say or do something they never did in real life," he said.