<p>The Government recognises the many benefits Artificial Intelligence can provide across a range of sectors, including our efforts to ensure public safety; however, artificial intelligence also poses significant risks to our efforts to tackle the proliferation of child sexual abuse material (CSAM).</p>
<p>This Government remains firmly committed
to tackling all forms of child sexual abuse online and in our communities across the
UK and internationally. Our approach is underpinned by the Tackling Child Sexual Abuse
Strategy, which sets out firm commitments to drive action across the whole system.</p>
<p>The law in the UK is very clear with regard to the production of child sexual abuse material.
It is an offence to produce, store, share or search for any material that contains
or depicts child sexual abuse, regardless of whether the material depicts a ‘real’
child or not. This prohibition also extends to pseudo-imagery, including computer-generated images.</p>
<p>Furthermore, the Government is currently driving forward the Online Safety Bill, which seeks to
make the UK the safest place in the world to be online. The Online Safety Bill will,
for the first time, place clear legal duties on technology companies to take proactive steps to identify and remove illegal content, including child sexual abuse content, from their platforms and services, and to prevent users from encountering it. AI-generated content is itself capable of amounting to a child sexual abuse offence, regardless of whether it depicts a real child. Child sexual abuse offences are priority illegal offences in the Bill, and tech companies are therefore subject to proactive duties to identify and remove child sexual exploitation and abuse (CSEA) content. In addition, Ofcom can require tech companies to use specified technology to remove such content.</p>
<p>Companies that fail to fulfil their legal
duties will be held to account by an independent regulator, Ofcom, which will have strong
enforcement powers, including fines of up to £18 million or 10 per cent of qualifying
annual global turnover (whichever is greater).</p>