<p>The government continues to put pressure on companies to respond quickly and effectively
to the threat posed by misinformation and disinformation.</p><p>Ministers
and officials hold regular discussions with major social media companies to understand
what is happening on their platforms and the steps that they are taking to address
misinformation and disinformation, including where it is spread by fake accounts.</p><p>We have seen positive steps by platforms to curtail the spread of harmful and
misleading narratives, particularly in relation to COVID-19, although there is clearly
more to do. We will continue to engage with platforms on measures that could be
put in place to respond to this evolving challenge, and we will press these
companies to ensure that their policies and enforcement are fit for purpose, whilst
still respecting freedom of expression.</p><p>The draft Online Safety Bill
sets out proposals to impose a new duty of care on tech companies to tackle illegal
and harmful content on their services. To fulfil this duty of care, the largest social
media companies will need to set out clearly in their terms of service what harmful
content is and is not acceptable. They will need to enforce those terms consistently,
including policies that may relate to fake user accounts. Ofcom will have the power
to hold companies to account if the content appearing on their platforms does not
match the commitments made to users.</p>