The following piece appeared in The Irish Times on Saturday, January 2, 2021:
It was a year few will wish to remember and many will wish to forget. Historians will analyse 2020 long into the future and inevitably offer judgements on how well the crisis was managed.
For many of its future chroniclers, 2020 will actually be seen as a pivotal moment that provided the fuel needed to accelerate a number of changes that were already taking shape. This is particularly true in relation to our habitation of the digital world; over the past nine months, more than ever, digital devices have provided entertainment playgrounds, shopping centres, newsstands, political battlegrounds and windows to the outside for vast populations across the world.
These online spaces have given us a much-needed sense of connection during the grimmest days of lockdown, but how much have we really thought about their seamless integration into our daily lives? If we’re not careful, there is a danger that the aftershocks of our migration online will eventually eclipse the great earthquake that has been Covid-19.
Since the first case of the virus was diagnosed, we have seen a proliferation of untruths spread online about its origins, about supposed remedies for its eradication – some promoted by the US president himself – and now we face a similar barrage of ‘fake news’ in relation to the alleged dangers of vaccination.
From this point of view, a contagion probably more lasting than the coronavirus is rapidly spreading across our world; we are in the midst of a pandemic of misinformation and disinformation and we desperately need to do something about it. If we don’t, and counterfeit truths are allowed to flourish, there will be an inevitable collapse of public trust. Without trust, democracy as we know it will simply decline into irrelevance.
From June 2019 to June 2020, I chaired a House of Lords Select Committee that looked at this very subject: the impact of digital technologies on democracy. Over the course of the evidentiary hearings, our all-party committee came to the conclusion that online platforms are distorting our belief in what we see, hear and read. We have arrived at a situation in which we no longer know who or what to trust.
Indeed, this sense of collective mistrust has filtered into the public discourse beyond the walls of our digital devices, to be seized upon by some politicians in the real world as a form of political weaponry. Recent plans by the UK government to renege on the Brexit Withdrawal Agreement and break international law appear to have treated the erosion of trust itself as a negotiating tactic.
These same politicians have also often used social media platforms to communicate with voters in digital walled gardens, neither checked by the free press nor held to account by the opposition.
During my time as chair of that select committee, it became clear that social media companies could be far more responsible about how information is amplified online, but wilfully refuse to change the way their algorithms work because doing so would be detrimental to their business models.
This is not the first time big business has resisted change in the face of evidence that its products are harming its customers.
In the mid-1960s, when consumer advocate Ralph Nader challenged General Motors to take passenger safety more seriously, the company first made grotesque attempts to undermine his credibility and then refused to act, believing safety devices would be bad for business. Consequently, safety features (like basic seatbelts) in the cars we drive today are there in spite of the largest automobile companies, not because of them.
The early 1990s saw a similar pushback from the tobacco industry when it was challenged about the harmful effects of nicotine. Despite overwhelming evidence to the contrary, each of the big tobacco companies consistently denied that their products were addictive, fraudulently protecting their business interests over those of their consumers.
‘Big Tech’ now faces its own day of reckoning. In recent weeks, sweeping new powers have been proposed in online safety legislation in Ireland, the UK and the EU.
In the UK, the long-awaited Online Harms Bill outlines a “legal duty of care” for companies to remove content considered “harmful”. However, the definition of what that ‘harm’ encompasses has been left frustratingly vague and, disappointingly, disinformation and misinformation have so far been omitted. It’s obvious they must be included in any legislation that is serious about harm to citizens.
On December 9th, the Irish Government published its own Online Safety Bill, which is similarly narrow in its definition of ‘harms’. However, it will establish a new media commission, which may propose a more inclusive definition of “harmful online content”.
In Brussels, in conjunction with a new Digital Markets Act to tackle unfair competition, a Digital Services Act was published on December 16th. It proposes requirements for tech companies to take greater responsibility for illegal behaviour on their platforms.
From Ireland’s point of view, there is a long way to go before the proposals of the EU’s Digital Services Act are signed into law, and questions remain about whether the rules will be enforced at a national or an EU-wide level.
It is proposed that a digital services coordinator be established in each member state but, in the case of persistent infringements, the European Commission could intervene on its own initiative.
As home to the European headquarters of many ‘Big Tech’ companies, Ireland stands in a unique position in Europe. Balancing the economic and employment needs of its citizens with the need to legislate for a trustworthy and civil digital ecosystem will be a challenge for Ireland’s leaders in the years to come.
As with cars and cigarettes, a heavy price will be paid for failing to adequately regulate companies that place their business interests above the safety of citizens, let alone the security of the democracies in which we are lucky enough to live. We must not allow future generations to be the ones to pay that price.