In a significant crackdown on online harassment, Instagram has removed approximately 63,000 accounts in Nigeria that were engaged in sextortion scams, primarily targeting adult men in the United States, with some attempts aimed at minors. The scammers used fake profiles to conceal their identities and entice victims into sharing explicit images, which they then used as leverage, demanding payment and threatening to send the compromising content to the victim’s friends and family if their demands weren’t met. Tragically, these scams have had devastating consequences, with the FBI linking sextortion schemes to at least 20 suicides.
To combat this growing problem, Meta, Instagram’s parent company, is testing new tools to protect users from sextortion. One such feature, currently being rolled out, is an on-device nudity protection alert that warns users about the risks of sending explicit images through Instagram’s direct messages. This proactive measure aims to help users make informed decisions before sharing sensitive content. Meta is also working closely with law enforcement agencies to investigate the alleged crimes and help hold those responsible accountable. By taking a multifaceted approach, Instagram hopes to create a safer environment for its users and shield them from the harmful effects of sextortion scams.