MPs and Lords have launched a joint parliamentary committee to scrutinize the government’s forthcoming Online Safety Bill. The new committee is already seeking input from the public on the legislation, which the government claims will safeguard freedom of expression online, increase the accountability of tech giants and protect users from harm online. Under the Bill’s statutory “duty of care”, tech companies that host user-generated content or allow people to communicate will be legally obliged to proactively identify, remove and limit the spread of both illegal and legal-but-harmful content – such as child sexual abuse, terrorism and suicide material – or face fines of up to 10% of their turnover from the online harms regulator, now confirmed to be Ofcom.
The joint committee is chaired by MP Damian Collins, the former chair of the House of Commons DCMS Select Committee, who previously led an inquiry into disinformation and “fake news” that concluded by calling for an end to the self-regulation of social media firms. “The Online Safety Bill is about finally putting a legal framework around hate speech and harmful content, and ultimately holding the tech giants to account for the role their technology plays in promoting it,” said Collins. “The next step in this process is the detailed scrutiny of the draft Bill. This is a once-in-a-generation piece of legislation that will update our laws for the digital age,” he said.
“We now have a super committee of MPs and peers, highly experienced in this area, who will work together to go through this Bill line by line to make sure it’s fit for purpose. Freedom of speech is at the heart of our democracy, but so is fighting against movements that seek to harm and dehumanize people. In social media, we have not yet got that balance right, and now is the time to fix it.”

The committee will report its findings to the government on 10 December 2021. It will also seek views specifically on how the draft Bill compares to online safety legislation in other countries.