As of this week, technology companies operating social platforms in the UK must implement stricter measures to tackle illegal content, under a new online safety regime enforced by the regulator Ofcom.
• Social platforms forced to act quickly
Facebook (Meta), TikTok (ByteDance), YouTube (Alphabet) and other similar networks must now apply improved moderation, offer more effective reporting systems and implement safety mechanisms to detect and prevent illegal activity. "Platforms must act quickly to comply with the new regulations, and our codes are designed to help them in this process," said Suzanne Cater, Ofcom's director of enforcement.
• Harsh penalties for companies that fail to comply
The Online Safety Act, which became law in 2023, imposes higher standards on content shared on social networks. Ofcom set a deadline of March 16 for companies to assess the risks that illegal content poses to their users. Companies that fail to comply can be fined up to £18 million or 10% of annual worldwide turnover, whichever is greater.
• Additional checks on file-sharing services
In addition, Ofcom has launched a separate programme to review the security measures adopted by file-sharing and file-storage services, which are considered particularly vulnerable to the distribution of illegal content. Providers of these services must submit their risk assessments by March 31 or face fines. Through these measures, the British authorities aim to create a safer digital environment and protect users from harmful content online.