Three-D Issue 35: It’s only a “conflict” between Facebook and regulators if it’s about the data

Declan McDowell-Naylor
Cardiff University

What do you do about a problem like Facebook? This question is set to define a decade of legislative and judicial agendas. And from Australia, we have now seen one approach in action.

Back in 2018, I questioned the intent behind Facebook’s hiring of Nick Clegg and the notion of “building bridges” between governments and Facebook that he used at the time. The key point I made was that Facebook’s thin appeals to “sensible” regulation are premised on the maintenance of the status quo. In other words, Facebook (and other technology companies) will tolerate democratic reform as long as its core business models remain intact. Making platforms pay publishers for news content is one way in which these business models are affected, hence Facebook’s aggressive move.

We are now entering a phase in which governments and parliaments are more determined to take action. Obviously, we have just seen this in Australia. Canada has vowed to follow suit. There have been various congressional hearings in the past year, with more on the way. The most recent was in November 2020, when Senator Lindsey Graham asked Mark Zuckerberg and Jack Dorsey whether they had seen the popular Netflix documentary The Social Dilemma. Notably, a European Parliament hearing has been “proposed”, though not yet scheduled, as the European Commission unveiled a range of sweeping regulatory measures.

The vague concern now is that technology companies are simply too powerful. Facebook’s withdrawal from Australia, and the consequences of that action, demonstrate this. Attention will now turn to the EU, should Facebook find itself in a similar situation there. In this context, Clegg recently published another op-ed, in which he lays out Facebook’s position once again, this time to a European audience. Within European policy-making, Clegg identifies two key responses to the “understandable concerns about the size and power of tech companies”. The first is regulation and moderation of online content within a democratic framework, which Facebook considers the good option; the second is sweeping prohibitions on the use of data, which it considers the bad option.

In terms of content moderation, Facebook does not want to be making unilateral decisions. Both Facebook and Twitter have banned or removed what they consider to be harmful content, but they do so from a weakly defined position prone to dispute, error, and controversy. While technology companies possess many of the means, what’s clearly lacking, as Clegg rightly points out, is a democratic framework. But this is axiomatic; speech laws and their enforcement should obviously rest with democratic institutions and the citizens they’re supposed to represent.

In terms of prohibitions on data use, Clegg objects to the ideas proposed by Axel Springer CEO Mathias Döpfner in a letter sent to European Commission President Ursula von der Leyen. In this letter, Döpfner states: “Data must once again belong to those to whom it has always belonged. To the citizens”, and calls upon the EU to “prevent the surveillance of our citizens by making it illegal to store all personal, private and sensitive data”. It is important to note that the GDPR and two proposed pieces of EU legislation, the Digital Services Act and the Digital Markets Act, aimed at regulating digital markets and holding companies to account, are broadly supported by Facebook. What Döpfner proposes is an entirely different social contract between technology companies and the state, but what undermines Clegg’s argument here is that, even in the face of Australia’s legislation, Facebook has completely withdrawn.

Clegg dismisses Döpfner’s proposals as resting on caricatures of Facebook’s power, and provides some academic evidence to counter them. Yet he follows this up by arguing that “the end of the use of personal data to provide goods and services” threatens “the future of German car manufacturers, Dutch supermarkets, French airlines, and millions of small businesses”. Is there a clearer indication of Facebook’s power than this: the claim that without allowing Facebook to own citizens’ data, whole sections of the economy would collapse? This is the very essence of the problem.

It’s true that tech companies have rooted themselves firmly in Europe’s future economic growth. But the idea that this growth is predicated on Facebook’s status quo is an unexamined, even false, premise that Clegg wraps up in alarming and intimidating language about economic catastrophe. This echoes how, in his 2018 op-ed, Clegg contrasted the reformist Californian ideology Facebook was offering with a damaging “tech-lash”. It is not a fantasy to suggest that Facebook should have competitors that aren’t Google, or that it hasn’t already bought out.

Through conduits like Clegg, Facebook is presenting itself as open to democratic regulation. Simultaneously, Facebook is advancing the logic that interference with its business practices will result in economic catastrophe. As Clegg’s op-ed suggests, it wants the private data it renders into a business product and the content platform that users access to be seen as largely separate things. In this way of thinking, democratic frameworks for content moderation will be seen as Facebook having cleared house, and the gap between politics and technology as having been successfully “bridged”.

The problem, as I wrote before, is that the unregulated markets in which the technology companies arose are the reason we have “understandable concerns about the size and power of tech companies” in the first place. Recent history is littered with unregulated corporations causing all kinds of damage to democratic society, both directly and indirectly.

What regulators have to realise is that, from Facebook’s perspective, the real fight isn’t over content regulation, but over data ownership and the crushing of any competitors in order to maintain its market position. And for those who care about the consequences of the actual power of technology companies, this is where they need to focus.

To illustrate this point, consider how Google has also prepared to withdraw its search engine from Australia over proposals to make it pay news publishers for content. Consider that both Facebook and Google have recruited five lobbying firms to fight the legislation. Consider that, despite sounding amicable about the EU’s regulatory proposals, the tech companies have constructed one of the largest lobbying operations ever seen in Brussels. And it works. Matt Hancock allowed himself to be intimidated by Facebook in private meetings in 2018, when it threatened to withdraw from the UK.

Trust is thin, and for good reason. As a result, there’s a common populist narrative of a cultural conflict between tech companies and politicians. This was how Clegg described it in 2018, and the idea is clearly fed by trends such as the Josh Hawley brand of conservative tech-scepticism making its way into the British Conservative Party. Such contributions are unhelpful because they indulge partisan grievances for the sake of gathering votes rather than achieving democratic outcomes. Facebook’s decision to aggressively withdraw from Australia, and to double down, will only fuel this narrative.

Yes, a series of legislative agreements on how to regulate the content published on Facebook in a way that conforms with democratic society will almost certainly be reached. Some of the outcomes might even be good. But a pattern of intimidating behaviour is emerging, in which tech companies threaten democratic governments with the withdrawal of their activity from certain markets in order to get their way. This needs to be addressed while the opportunity for regulation is there.
