In its Q1 2024 Adversarial Threat Report, Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, announced the removal of 63,000 accounts connected to the infamous “Yahoo Boys” scam group. The accounts were involved in financial sextortion scams and in distributing blackmail scripts.
Over the past few weeks, Meta identified and disabled these accounts through advanced technical signals and comprehensive investigations, work that also strengthened its automated detection systems. A smaller network of about 2,500 of these accounts, linked to approximately 20 individuals, specifically targeted adult men in the United States using fake identities.
“Financial sextortion is a borderless crime, driven by the increasing activity of Yahoo Boys, loosely organized cybercriminals primarily operating from Nigeria and specializing in various scams,” Meta stated. The company also removed Facebook accounts, Pages, and groups run by Yahoo Boys that attempted to recruit and train new scammers.
During the investigation, Meta found that most of the scammers’ attempts were unsuccessful, though some targeted minors; those cases were reported to the National Center for Missing & Exploited Children (NCMEC). Meta also shared information with other tech companies through the Tech Coalition’s Lantern program to help curb these scams across platforms.
Additionally, Meta removed about 7,200 assets in Nigeria, including 1,300 Facebook accounts, 200 Pages, and 5,700 groups that provided scam-related resources such as scripts, guides, and links to photos for setting up fake accounts. Since the disruption, Meta’s systems have been actively blocking attempts by these groups to return, and the company continues to improve its detection capabilities.
Meta has been collaborating with law enforcement, supporting investigations and prosecutions by responding to legal requests and alerting authorities to imminent threats. Its efforts extend beyond account removal: Meta funds and supports NCMEC and the International Justice Mission in running Project Boost, a program that trains law enforcement agencies worldwide on processing and acting on NCMEC reports. Several training sessions have been conducted, including in Nigeria and Côte d’Ivoire, with the latest session held last month.
To further protect users, especially teens, Meta has implemented stricter messaging settings for users under 16 (under 18 in certain countries) and displays safety notices to encourage cautious online behavior.
In a separate development last week, Meta was fined $220 million by Nigeria’s Federal Competition and Consumer Protection Commission (FCCPC) for multiple violations of data protection laws related to WhatsApp. The investigation, initiated in May 2021, found that Meta’s privacy policies infringed on users’ rights through unauthorized data sharing and discriminatory practices. Meta plans to appeal the decision, disputing both the findings and the imposed penalty. The FCCPC says it aims to ensure fair treatment of Nigerian users and compliance with local regulations.