- Deepfake calls are claiming thousands of victims, report finds
- Up to 10% of spam calls are fraudulent
- The top scam targeting British victims was fake HMRC calls
AI deepfake fraud calls are dominating the scam landscape, and they are costing British consumers millions of pounds.
A new Hiya report details the rise of voice scams and deepfake voice fraud in the UK and abroad, noting how the emergence of generative AI means deepfakes are more convincing than ever and can be deployed more frequently, even to target companies and C-suite executives, making deepfakes in 2024 all the more dangerous.
AI lowers the barrier for criminals to commit fraud, making scamming victims easier, faster, and more effective. The average successful scam costs its British victim £595, the report reveals.
Billions of calls
Hiya says it flagged 11.3 billion suspected spam calls globally in the fourth quarter of 2024 alone, roughly 123 million calls per day. Of these, 22% were marked as nuisance calls and 9% as fraudulent, which may not sound like much, but it means roughly one in ten unexpected calls could cost you hundreds if you're not careful.
A survey found that 26% of UK residents had received a deepfake voice call in the last 12 months; of those, 40% reported being scammed, 35% reported losing money, and 32% had personal information stolen.
These were mainly financial and banking scams, which accounted for 11% of deepfake calls, closely followed by insurance, holiday booking, and delivery service impersonations (8% each).
Overall, the Global Anti-Scam Alliance estimates that a staggering $1.03 trillion was lost to scams in 2024, and deepfakes are steadily becoming one of criminals' tools of choice.
“As we reflect on the last quarter of 2024, it is evident that AI-powered fraud is becoming more sophisticated than ever, posing a serious threat to consumers and companies,” said Alex Algard, CEO of Hiya.
The top scam in the UK was an impersonation of His Majesty's Revenue and Customs (HMRC), in which victims are told that a criminal case is being brought against them for tax fraud, and that an arrest warrant has even been issued in their name.
This type of fraud aims to panic victims into believing they are in trouble, pressuring them to hand over bank details, financial information, or personally identifiable information (PII).
It's important to bear in mind that even if the 'only' thing a scammer gains access to is your personal data, this still leaves you at serious risk of identity theft, since criminals can take out loans, credit cards, or bank accounts in your name.
How to protect yourself
The report comes shortly after another recent study showed that when 2,000 people were shown deepfake content, only two of them managed to achieve a perfect score at spotting it, so everyone should be on their guard.
Deepfakes are essentially social engineering scams: the natural evolution of phishing attacks, which often impersonate banks, popular services, colleagues, or even relatives to try to trick victims into clicking malicious links, scanning dangerous QR codes, or handing over their personal details.
However, voice and video deepfakes are arguably more dangerous, since they can be seriously convincing. We recommend establishing a safe word with your family and close friends (anyone who might call you with an emergency), so you can be sure you're talking to who you think you are.
Beyond friends and family, be very careful with any call from someone claiming to be your bank, a software company, or any company whose services you use regularly. If your 'bank' or HMRC calls, look up the organisation's number yourself, call it back, and ask to be transferred to the same person.
Don't give out your information to anyone over the phone, and be sure to change your passwords regularly and use a different password for each site that holds sensitive information. If you need advice, we've compiled a list of the best tips for creating a secure password.