The Online Safety Bill
Marilyn Hawes examines the implications of the new Online Safety Bill
After five years of wrangling, mostly about freedom of speech, the Online Safety Bill has finally received Royal Assent from the King.
Many businesses and media organisations that rely on freedom of speech have fought the Bill every inch of the way. However, while their point is understandable, is freedom of speech the same issue as safety for children? The reality is that the Internet will NEVER be entirely safe, as it was not created as a platform suitable for youngsters.
The Essence of the Bill
- Online companies such as Meta and Wikipedia will be accountable and responsible for user safety;
- Social media must remove illegal content;
- They must set age limits and carry out age checks and risk assessments;
- Self-harm images and content MUST be removed;
- Social media is now legally responsible for removing what may be legal BUT is nonetheless harmful content, such as violence, suicide, and pornography;
- They must remove deep fake images, including pornography created as deep fake;
- Because of provisions that could undermine end-to-end encryption, WhatsApp has said it may become unavailable in the UK;
- Ofcom can require the release of private messages and will act as the regulator; and
- Ofcom will oversee the setting of standards, and companies that fail to comply face fines of up to £18 million or 10% of their annual global turnover, whichever is greater, AND their executives face imprisonment.
Ofcom will set the standards, which will be phased in, with each set of duties taking effect once the corresponding codes are published:
- First set – child sexual exploitation, fraud, and terrorism; and
- Second set – Child safety duties, pornography, transparency, and user empowerment
It may not be perfect, but it is an excellent start. Many consider that more still needs to be done to prevent harm to children, but amendments and additions will be made accordingly.
Freedom from Abuse Recommends that Parents:
- Cover webcams;
- Put in place parental locks (free to download via Google);
- Monitor your child’s use of phones and iPads; and
- TALK to your children, explaining the reasons for blocking certain apps and games.
What the Act means for Children and Professionals
The online world plays a huge role in the lives of children and young people. Social media, online gaming, instant messaging platforms and image-sharing services enable children to interact with their peers, develop and pursue interests, and connect with new communities. However, these platforms and services also come with risks, including online abuse, grooming, and exposure to content that is illegal or harmful.
The Online Safety Act 2023 sets out to minimise these risks, placing new legal duties and responsibilities on online service providers to keep children and young people safe online.
How Will the Online Safety Act Affect Professionals Working With Children?
The Act places the onus on tech companies to keep children safe on their services and platforms. Although the Act won’t affect your duties as a professional, it’s important to be aware of changes that may impact your professional practice.
Social media companies will have to provide adults and children with clear, accessible and easy-to-use ways to report problems and make complaints online if harms arise. So if you think a site is falling short of the required standards, it should be easy to raise your concerns with the platform.
If you have ongoing concerns about a platform, you can make a complaint to Ofcom. While Ofcom cannot respond to individual complaints, this information can help them to assess which services are complying with the regulation.1
The Act also introduces new criminal offences, including:
- An intimate image abuse offence, which makes it a crime to share an intimate image of someone without their consent; and
- A ‘cyberflashing’ offence, which criminalises sending an explicit image for the purpose of sexual gratification or to cause the recipient humiliation, alarm or distress.
It’s important that you are aware of these new offences, and that you know what steps to take if you need to support a young person who has had an image shared without their consent, or who has received or sent an explicit image.
Tackling Illegal and Harmful Content
Companies will now need to prevent, detect and remove illegal content. This includes content depicting, promoting or facilitating:
- Child sexual abuse;
- Controlling or coercive behaviour;
- Terrorism; and
- Suicide.
Companies must prevent children from accessing content that is harmful or age-inappropriate. This includes content depicting, promoting or facilitating:
- Pornography;
- Serious violence;
- Bullying;
- Self-harm; and
- Eating disorders.
Regular Risk Assessments
Companies must assess the risks and dangers that their platforms pose to the safety of children. If risks are identified, companies are required to act by putting mitigations in place.
Larger companies will also need to publish a summary of their risk assessments, promoting increased transparency around the risks that online platforms and services pose to children.
Enforcing Age Limits
If harmful or age-inappropriate content is present on a platform, companies must use age verification or age estimation tools to prevent children from encountering this type of content.
Companies will have to declare which age assurance tools they are using, if any, and show that they are enforcing their age limits.
How Will the Online Safety Act be Enforced?
Ofcom will be working with tech companies to make sure they are protecting their users and following the requirements set out in the Act. Its draft guidance and codes of practice are currently under consultation and will come into force once approved by Parliament.
If companies fail to comply with the new rules, Ofcom has powers to enforce:
- Fines of up to £18 million, or 10% of the company’s annual global turnover, whichever is greater;
- Criminal action against companies and/or senior managers who fail to comply with requirements or fail to follow requests from Ofcom; and
- Business disruption measures, including preventing companies from being accessed or generating income in the UK.
How Will the Online Safety Act Keep Children Safe?
The Act means that tech companies running social networking sites or search engines must promote online safety by tackling illegal material and content that is harmful to children, conducting regular risk assessments, and properly enforcing age limits.
To make sure companies meet these requirements, the government has placed the independent regulator Ofcom in charge of enforcing the regulatory framework and raising awareness around online safety.
Marilyn Hawes
Marilyn is the Founder of the charity Freedom From Abuse which provides support and resources to educate users on how to identify an abuser, report abuse and protect children in their care. A survivor of abuse herself, she was named Inspirational Woman of the Year in 2017.