Online child sexual exploitation is increasing, with artificial intelligence enabling new forms of abuse, including AI-generated images and videos of child sexual abuse.
Reports of online child abuse to the National Center for Missing & Exploited Children (NCMEC) rose by more than 12% from the previous year to over 36.2 million in 2023, according to the organization’s annual CyberTipline report. Most reports concerned the distribution of child sexual abuse material (CSAM), including photos and videos. Online criminals are also enticing children to send nude images and videos for financial gain, reflected in a rise in reports of blackmail and extortion.
NCMEC has reported instances where children and families have been targeted for financial gain through blackmail using AI-generated CSAM.
The center has received 4,700 reports of child sexual exploitation images and videos created by generative AI, although tracking in this category only began in 2023, according to a spokesperson.
NCMEC says it is alarmed by the growing trend of malicious actors using artificial intelligence to produce deepfaked sexually explicit images and videos from real children’s photos, calling the practice devastating for victims and their families.
The group emphasizes that AI-generated child abuse content hinders the identification of real child victims, and that producing such material is a federal crime in the United States.
In 2023, the CyberTipline received over 35.9 million reports of suspected CSAM incidents, with most uploads originating outside the US. There was also a significant rise in reports of online solicitation, in which adults communicate with children for sexual purposes or attempted abduction.
The platforms generating the most CyberTipline reports included Facebook, Instagram, WhatsApp, Google, Snapchat, TikTok, and Twitter.
Of the roughly 1,600 companies worldwide registered for the CyberTipline reporting program, only 245 submitted reports to NCMEC; among those registered are US-based internet service providers, which are required by law to report CSAM incidents to the CyberTipline.
NCMEC highlights the importance of report quality: some automated reports lack sufficient detail to be actionable without human review, which can hinder law enforcement’s ability to detect child abuse cases.
NCMEC’s report stresses the need for continued action by Congress and the tech community to address reporting issues.
Source: www.theguardian.com