The Online Safety Bill has become law in the UK, designed to force technology firms to take more responsibility for the content on their platforms.
Generally this is seen as very good news, with the NSPCC quoted as saying:
‘It marks a new era for children’s safety online at a time when online grooming and child abuse image crimes are at an all-time high.’ and ‘This is a ground-breaking piece of legislation that will radically change the landscape for children online.’
https://www.nspcc.org.uk/about-us/news-opinion/2023/2023-09-19-the-online-safety-bill-has-been-passed-in-a-momentous-day-for-children/
One area which is possibly more contentious is the Bill's powers to compel digital messaging services to examine the content of encrypted messages, potentially by providing a backdoor. Some providers, such as WhatsApp, Signal and Apple (for iMessage), have already said that this is not possible, as their applications are inherently end-to-end encrypted. Some have suggested that they may withdraw from the UK market entirely if they are compelled to comply, though the reality may turn out differently.
The tension between online safety and privacy is not new, but this legislation may bring it into sharper focus. So what does this mean for us? Certainly you should have clear policies and procedures for the use of digital messaging platforms, and consider ways to protect your staff and volunteers from harmful content.