Internet Watch Foundation Annual Report
‘Extreme’ Category A child sexual abuse found online doubles in two years
The Internet Watch Foundation Annual Report 2022 sheds light on the alarming digital and social emergencies taking place #BehindTheScreens, specifically in children's bedrooms. The report reveals a disturbing trend: the number of webpages containing Category A material, which represents the most severe forms of child sexual abuse, has more than doubled since 2020. In 2022, the IWF's dedicated analysts detected and removed child sexual abuse imagery from a staggering 51,369 web pages, compared to 25,050 web pages in 2020.
The situation is even more distressing given that the youngest and most vulnerable children endure some of the most horrendous forms of sexual abuse. Infants and toddlers, subjected to acts such as rape and sexual torture, are among the victims of these heinous crimes. As the number of reports concerning child sexual abuse continues to rise, the IWF report serves as an urgent wake-up call for child protection advocates, including policymakers, internet service providers, and parents and carers.
Highlights from the 2022 Report
- In 2022, the IWF assessed 375,230 reports and confirmed 255,570 web pages contained images or videos of children suffering sexual abuse;
- Category A material – the most severe forms of sexual abuse – now accounts for 20 per cent of all the content we see;
- Three in every five (59%) child sexual abuse reports were hosted in an EU member state, demonstrating the ‘desperate’ need for the EU’s proposed legislation to tackle child sexual abuse material;
- Commercial webpages exploiting the sexual abuse of children have more than doubled over the past two years, with our analysts identifying 12,900 URLs in 2020 and 28,933 URLs in 2022;
- While the majority of imagery found shows girls (96 per cent or 242,989 instances), there has been a 137 per cent rise in imagery featuring boys (2,641 instances in 2021, compared to 6,253 in 2022).
Report Remove
In June 2021, the IWF and the NSPCC (National Society for the Prevention of Cruelty to Children) collaborated to develop an innovative tool called Report Remove. This tool, the first of its kind worldwide, helps young people remove sexual images or videos of themselves from online platforms.
The process involves the NSPCC's Childline service, which ensures that the young person is safeguarded and supported throughout the entire procedure. The IWF evaluates the reported content and takes appropriate action if it meets the criteria for illegality. To prevent the imagery from being uploaded or distributed online, the content is assigned a unique digital fingerprint (hash), which is then shared with internet companies.
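The hash-matching step can be illustrated with a minimal sketch. In practice, services working with the IWF use both cryptographic hashes and perceptual hashes (such as PhotoDNA, which survives resizing and recompression); the toy example below shows only the simpler exact-match case using SHA-256, and all function names and data are illustrative, not the IWF's actual implementation.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes (exact-match fingerprint)."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes, blocklist: set[str]) -> bool:
    """Check an upload's fingerprint against a shared list of known hashes."""
    return file_hash(data) in blocklist

# A platform receives the hash list; an identical re-upload is caught,
# while unrelated content passes through.
blocklist = {file_hash(b"previously reported image bytes")}
print(is_known(b"previously reported image bytes", blocklist))  # True
print(is_known(b"unrelated image bytes", blocklist))            # False
```

Because only the hash is shared, platforms can block re-uploads without the imagery itself ever being redistributed.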
This solution prioritises the well-being and needs of the child, offering a user-friendly approach to image removal that can be carried out entirely online. The young person is not required to disclose their identity, can make a report at any time, and can access additional information and support from Childline whenever necessary.
To facilitate the process, young individuals can create or sign into a Childline account, which allows them to receive email updates about their report. The email service can be utilised for ongoing support, and they can also reach out to a Childline counsellor through online chat or the provided freephone number. Additionally, the Childline website offers relevant information, advice, self-help tools, and peer support.
In 2022, the Report Remove tool received a total of 187 reports, of which 101 were actioned. These actionable reports were assessed to contain images, videos, or URLs linking to child sexual abuse content, as defined by UK legislation.
Among the reports assessed as criminal imagery, the majority (69%) were Category C images. Notably, more boys than girls made reports, with boys accounting for 73% of the total.
‘Self-generated’ child sexual abuse
In 2022, there was a significant prevalence of "self-generated" child sexual abuse imagery. These distressing images and videos involve children creating and sharing explicit content using smartphones or webcams, which are then circulated online. It is important to note that in some instances, children are coerced, deceived, or blackmailed by individuals who are not physically present with them into producing and sharing such explicit material. Typically, these images are captured within the familiar confines of a child's home, such as their bedroom or bathroom.
Regarding the trends observed, children between the ages of 11 and 13 continue to feature most frequently in "self-generated" imagery, as seen in previous years. Notably, there was a significant rise of 129% in the proportion of this type of imagery involving children aged 7 to 10 in 2022, compared to 2021.
Of the 255,571 webpages actioned during 2022, more than three-quarters (199,363, or 78%) were found to contain "self-generated" imagery. This is a 6 percentage point increase on 2021, when 72% of actioned reports (182,281) contained this type of content. In absolute terms, the number of "self-generated" reports rose by 9% from 2021 to 2022.
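The 6-percentage-point and 9% figures measure different things and are easy to conflate; both follow directly from the report's own numbers:

```python
# Figures taken from the report
self_gen_2022, total_2022 = 199_363, 255_571
self_gen_2021 = 182_281

# Share of all actioned webpages that were "self-generated" in 2022
share_2022 = self_gen_2022 / total_2022        # ≈ 0.78, i.e. 78% (vs 72% in 2021,
                                               # a 6 percentage point increase)

# Relative growth in the absolute count of "self-generated" reports
rise = self_gen_2022 / self_gen_2021 - 1       # ≈ 0.09, i.e. a 9% rise

print(round(share_2022 * 100))  # 78
print(round(rise * 100))        # 9
```

A percentage-point change compares two shares of a total, while a percent rise compares two absolute counts, which is why the same underlying data yields both a "6 point" and a "9%" figure.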
Non-photographic child sexual abuse
The IWF is responsible for identifying and removing non-photographic child sexual abuse images and videos hosted in the UK, which are classified as Prohibited Images. In 2022, it took action on 285 reports concerning non-photographic child sexual abuse imagery, a 22% increase on 2021. However, upon assessment, none of these reports were confirmed as content hosted in the UK. By the end of 2022, the Non-Photographic Image (NPI) URL List contained 324 unique URLs associated with non-photographic child sexual abuse imagery; 18 IWF Members subscribe to this list.

The UK is among the few countries where non-photographic child sexual abuse imagery is a criminal offence. If the IWF identifies such content hosted within the UK, it issues a notice to the hosting provider for its removal, although no UK-hosted instances have been found since 2016. Nevertheless, non-photographic content of this nature does exist online, and if it were hosted in the UK, it would violate UK law.