Tech firms which fail to stamp out illegal content could be stung with an £18m fine or 10% of their annual global turnover, says Ofcom.
The regulator is to be given extra powers, through an amendment to the Online Safety Bill (OSB), to ensure technology companies remove child sexual abuse and exploitation content from the internet.
If Ofcom imposes scanning technologies, experts say, there are concerns around technical feasibility, security implications and privacy.
And, according to industry observers, there are serious misgivings about how or if the regulations can work in practice.
According to the BBC, one of the main concerns is over “end-to-end encryption” (E2EE) messaging and how it might be infiltrated.
The government is backing tools which can detect child abuse imagery within E2EE messaging while respecting users’ privacy.
Prof Alan Woodward, of the University of Surrey, told the BBC: “If the OSB insists on discovering such material in encrypted data, it can be achieved only by examining the sending and receiving devices – that is, where it is decrypted for consumption.
“The implication is some form of universal ‘client-side scanning’ which many will see as overly intrusive and liable to… be used to detect other items unrelated to child safety.”
‘Client-side scanning’ is a means of scanning message content for matches against known abuse imagery before the message is sent.
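Conceptually, the approach can be sketched in a few lines: check an attachment against a blocklist on the device, before the message is encrypted. The function and blocklist below are hypothetical, and real deployments use perceptual hashing (such as Microsoft’s PhotoDNA), which survives resizing and re-encoding; a cryptographic hash is used here only to keep the sketch self-contained.

```python
import hashlib

# Hypothetical blocklist of known-image digests (illustrative only).
# The entry below is the SHA-256 digest of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the attachment matches the blocklist (i.e. would be flagged)."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES

# The client would run this check before encryption takes place.
assert scan_before_send(b"test") is True
assert scan_before_send(b"holiday photo") is False
```

The point of contention is precisely this placement: the check runs on the user’s device, on content that has not yet been encrypted.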
Experts say there is no way to make scanning technologies work for “good” purposes only.
Prof Woodward added: “The big issue will be that any technology that can be used to look at what is otherwise encrypted could be misused by bad actors to conduct surveillance.”
WhatsApp and Signal are popular with users because, under E2EE, only the sender and the recipient of a message can know its content.
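A toy model makes that property concrete: if only the two endpoints hold the key, the relaying server sees nothing but ciphertext. This sketch uses a one-time-pad XOR purely for illustration; real messengers use the Signal protocol, not anything resembling this.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR (toy example, not a real cipher suite).
    return bytes(a ^ b for a, b in zip(data, key))

key = secrets.token_bytes(32)          # shared only by sender and recipient
plaintext = b"hello"
ciphertext = xor(plaintext, key)       # all the relaying server ever sees

assert ciphertext != plaintext         # the server cannot read the message
assert xor(ciphertext, key) == plaintext  # a key-holder can recover it
```

Because decryption happens only at the endpoints, any scanning the OSB mandates must also happen at the endpoints, which is what leads experts to client-side scanning.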
Susie Hargreaves, chief executive of the Internet Watch Foundation (IWF), has called for provisions in the OSB to enable Ofcom to “co-designate” with the IWF to regulate abuse content online.
She said: “Our unparalleled expertise in this area would make the response strong and effective from day one.
“We have the strong collaborative relationships with industry, law enforcement, as well as the world-leading expertise which can make sure no child is left behind or their suffering left undetected.”
The government will argue that tech firms cannot claim to be unable to deploy certain technologies because of how their platforms are configured.
But Ofcom will have powers to demand that a company use its “best endeavours” to develop or source tools to remove illegal images.
Prof Woodward said: “Ofcom has a steep hill to climb. It will need to attract a lot of rare talent…to come up with the technical solutions demanded by the OSB.
“That’s not to mention the skills they will need to navigate the secondary legislation…It’s a truly huge task ahead of them.”
Ofcom told the BBC: “Tackling child sexual abuse online is central to the new online safety laws – and rightly so. It’s a big and challenging job, but we’ll be ready to put these ground-breaking laws into practice once the OSB is passed.”
The National Crime Agency, which compiles official statistics, estimates there are up to 850,000 people in the UK posing a sexual threat to children.
Access to such content online can lead to offenders normalising their own consumption of this content, sharing methods with each other on how to evade detection, and escalation to committing actual child sexual abuse offences.
Digital Minister Nadine Dorries said tech firms have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online.
She added: “Nor should they blind themselves to these awful crimes happening on their sites.”