Britain’s online safety regime took effect on Monday, requiring social media companies such as Meta’s Facebook and ByteDance’s TikTok to take steps to tackle criminal activity and make their platforms safer by design.
Ofcom, the media regulator, said it had published its first codes of practice for tackling illegal harms such as child sexual abuse and content encouraging or assisting suicide. Ofcom has given websites and apps until March 16, 2025, to assess the risks that illegal content poses to children and adults on their platforms.
After that deadline, they must begin implementing risk-mitigation measures such as improved moderation, easier reporting, and built-in safety tests, Ofcom said. Melanie Dawes, Ofcom’s Chief Executive, said the safety spotlight was now firmly on tech firms.
“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” she said. The Online Safety Act, which became law last year, sets tougher standards for platforms such as Facebook, YouTube, and TikTok, with an emphasis on child protection and the removal of illegal content.