<p>Under the Online Safety Bill, all platforms will need to undertake risk assessments
for illegal content and content that is harmful to children. This will ensure they
understand the risks associated with their services, including in relation to their
algorithms. They will then need to put in place proportionate systems and processes
to mitigate these risks.</p><p>Platforms that are likely to be accessed by children
will need to fulfil these duties in relation to harmful content and activity, including
legal self-harm and suicide content. Assisting suicide has also been designated as
a priority offence in the Bill, so all platforms will be required to take proactive
steps to tackle this type of illegal content. The government will also bring forward
a new self-harm offence. Companies will therefore need to remove communications that
intentionally encourage or assist self-harm.</p><p>The largest platforms will also
have a duty to offer all adult users tools to reduce their exposure to certain kinds
of legal content. On 29 November the government announced its intention for these
tools to apply to legal self-harm and suicide content. These tools could include the
option of switching off algorithmically recommended content.</p>