Linked Data API

1548422

registered interest false
date 2022-12-05
answering body Department for Digital, Culture, Media and Sport
answering dept id 10
answering dept short name Digital, Culture, Media and Sport
answering dept sort name Digital, Culture, Media and Sport
hansard heading Social Media: Mental Health
house id 1
legislature 25259 (House of Commons)
question text To ask the Secretary of State for Digital, Culture, Media and Sport, what steps her Department is taking to regulate social media algorithms to reduce user exposure to (a) self-harm and (b) suicide-related content.
tabling member constituency Lanark and Hamilton East
tabling member printed Angela Crawley
uin 102909
answer
is ministerial correction false
date of answer 2022-12-08
answer text Under the Online Safety Bill, all platforms will need to undertake risk assessments for illegal content and content that is harmful to children. This will ensure they understand the risks associated with their services, including in relation to their algorithms. They will then need to put in place proportionate systems and processes to mitigate these risks.

Platforms that are likely to be accessed by children will need to fulfil these duties in relation to harmful content and activity, including legal self-harm and suicide content. Assisting suicide has also been designated as a priority offence in the Bill, so all platforms will be required to take proactive steps to tackle this type of illegal content. The government will also bring forward a new self-harm offence. Companies will therefore need to remove communications that intentionally encourage or assist self-harm.

The largest platforms will also have a duty to offer all adult users tools to reduce their exposure to certain kinds of legal content. On 29 November the government announced its intention for these tools to apply to legal self-harm and suicide content. These tools could include the option of switching off algorithmically recommended content.

answering member constituency Sutton and Cheam
answering member printed Paul Scully
question first answered 2022-12-08T17:24:46.533Z
answering member 4414 (Paul Scully)
tabling member 4469 (Angela Crawley)
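
The record above is a single item served by the Linked Data API, which also exposes each resource in machine-readable formats such as JSON. Below is a minimal sketch of how a record like this one might be fetched and a few fields read out; the endpoint URL and the JSON key names (result, primaryTopic, uin, questionText) are assumptions for illustration, not confirmed details of the API.

import requests

# Minimal sketch of retrieving this record over HTTP as JSON.
# The base URL and resource path are assumptions inferred from the
# item id shown above (1548422); consult the API documentation for
# the real endpoint and response layout.
ITEM_URL = "https://lda.data.parliament.uk/resources/1548422.json"

response = requests.get(ITEM_URL, timeout=30)
response.raise_for_status()
payload = response.json()

# Linked Data API responses of this style typically wrap the item in a
# "result" object whose "primaryTopic" holds the fields listed above
# (uin, questionText, the nested answer, and so on). These key names
# are assumptions for illustration.
record = payload.get("result", {}).get("primaryTopic", {})
print(record.get("uin"))
print(record.get("questionText"))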