Three-D Issue 35: Platform politics, online harms and future research directions
Paul Reilly
University of Sheffield
Phil Ramsey
Ulster University
How can online platforms do better in the fight against ‘fake news’, hate speech and other online harms? What role should researchers play in the regulatory frameworks governing big tech companies like Facebook?
These were among the issues discussed at ‘Platforms and policies: agendas for research and policy action’, a MeCCSA Policy Network event held virtually on 12 January 2021. Organised and chaired by Policy Network Vice-Chair Jonathan Hardy (UAL), the webinar brought together an interdisciplinary group of researchers and practitioners to discuss how online platforms, policy-makers and researchers should respond to online harms.
The event began with a presentation by Sonia Livingstone (LSE) on children’s rights in digital environments, drawing on her experience as an expert advisor to the UN Committee on the Rights of the Child and her work on drafting a General Comment on children’s rights in relation to the digital environment. As many as one in three of the world’s internet users are said to be children, so ensuring that the UNCRC can be applied across radically different states is a priority.
Our next speaker was Lina Dencik (Cardiff University), who gave an overview of her research and the work of the Data Justice Lab. She emphasised the structural inequalities exacerbated by online platforms, issues that have become increasingly politicised in recent years. Lina concluded by assessing the human rights implications of the dataveillance associated with big tech companies like Facebook.
Alaphia Zoyab (Reset.tech) gave a practitioner perspective on the importance of reforming the regulation of big tech companies. These platforms turbocharge misinformation and hate speech in countries such as India, illustrating the shortcomings of the current regulatory system. Alaphia argued that any regulatory system put in place needed to be auditable and transparent to citizens.
Eleonora Mazzoli (LSE) explored the policy implications of the content curation processes employed by online platforms. This presentation summarised some of the key findings from a recent Council of Europe report Eleonora co-authored with Damian Tambini (LSE). An overview of recent policy developments in countries such as Australia, Germany, and the UK was also provided.
A wide-ranging discussion followed the four presentations. Participants debated whether policy interventions such as the proposed UK Online Harms Bill were likely to provide greater protection for children and young people in digital environments. There was some scepticism about the viability of the UK government’s plans to make Ofcom responsible for regulating online platforms. It was also felt that the ‘platform’ metaphor was strategically deployed by these companies in order to deny responsibility for harmful content published on their sites: if they were classified as publishers, they would be more likely to act quickly to prevent hate speech and misinformation from being spread online. There was also a lively discussion about the implications of researchers collaborating with big tech companies in order to gain access to data on societal problems such as hate speech and misinformation. These issues are unlikely to be resolved in the short term, but research on them remains imperative.