Fraudsters are continuously refining their tactics, making scams increasingly sophisticated and convincing. Cybersecurity experts warn that one of the latest techniques involves deepfakes, which use artificial intelligence to mimic a person's voice and, in video form, even their facial expressions, making fraudulent approaches appear far more authentic.

Irene Corpuz, a founding partner and board member at Women in Cybersecurity Middle East, recently discussed a case in May in which the Hong Kong office of a British engineering firm fell victim to a scam involving an AI-generated video call, losing approximately HK$200 million. Corpuz warned that scammers often engage victims in phone conversations specifically to record their voices for use in future scams, and that the same tactic can be employed during Zoom meetings with multiple participants. Hearing a familiar voice or seeing a video of a friend or loved one, she emphasized, significantly increases a scam's believability.

To protect against audio deepfakes, Corpuz advises the public to be careful with the words they use, particularly when answering calls from unknown numbers. She suggests avoiding one-word 'yes' or 'no' replies, as scammers can record these responses and replay them to confirm fraudulent transactions. Corpuz also noted that scammers may use verification tactics to appear legitimate, such as reading out partial details from an Emirates ID in the hope of tricking the victim into supplying the rest.

JD Ackley, CEO of Raizor, a company specializing in conversational AI, advises staying vigilant against unsolicited calls and refusing requests for payment in unusual forms, such as gift cards or money transfers. He recommends asking for a callback number to verify a caller's legitimacy. Barney Almazar, director of the corporate-commercial department at Gulf Law, emphasizes the role of education and awareness in combating scams, pointing to the strict penalties available under the UAE Cybercrime Law for such offences.