Author: A.Vipond


Creating Space for Informed Democracy


Nicolas Cage replaces Tom Hiddleston in a deepfake of Thor: Ragnarok. Image Credit: ABC News

“I have a foreboding of an America in my children’s and grandchildren’s time – when the United States is a service and information economy; when awesome technological powers are in the hands of very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish what feels good and what’s true we slide without noticing, back into superstition and darkness.

We’ve arranged a global civilisation in which the most crucial elements – transportation, communications and all other industries, agriculture, medicine, education, entertainment, protecting the environment; even the key democratic institution of voting profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.”

So wrote the famed scientist Carl Sagan in 1995. Almost a quarter of a century later, here we are living through the fallout he predicted. Around the world, the open information systems that democracies rely on have been degraded by the pace of scientific and technological acceleration, challenged by globalisation and weaponised to erode the space for informed societal debate.

So, what can Australia do? If the 2016 US elections were the canary in the coalmine that revealed systemic weaknesses in democracy’s information systems, what can be done to repair and renew them?

Democracies’ information systems support spaces for informed debate and disagreement to drive decisions that positively advance democracy, from policy issues and voting in elections, to corruption investigations and the exploration of new governance concepts.

The openness of democracies’ information systems leaves them vulnerable to information attacks, which can create feedback loops of self-reinforcing and damaging expectations that undermine rules, institutions and society. The primary aim of information attacks is to exacerbate pre-existing divisions, sow mistrust and flood the space for informed debate so that it becomes a mechanism for dysfunction and damage to society. Examples range from the Russian Internet Research Agency’s inflaming of racial tensions to the Chinese government’s use of fake social media accounts to attack Hong Kong protesters.

As Bruce Schneier and Henry Farrell have opined, this is Democracy’s Dilemma: the open forms of input and exchange that democracy relies on can be weaponised to inject falsehood and misinformation that erode democratic debate.

We need to map and evaluate these new vulnerabilities to ensure democracy’s core functions are resilient in a world that will only become more interconnected with the fourth industrial revolution.

Injecting falsehood and misinformation into a democracy is not a new vulnerability. However, the range of methods available for attacking open information systems has widened. The weaponisation of social media, automation, machine learning, the internet of things and soon quantum computation (which may recently have achieved supremacy) is making attacks cheaper, easier to scale and more deniable, and will continue to do so.

When citizens make political choices in a democracy, they rely on synthesising information from different people and sources to come to a decision. Online, that information does not always flow freely and cannot be counted on to be authentic.

If the space for informed debate is compromised or undermined by attack, whether it be a parliament, newspaper, departmental inquiry, court of law or public submissions process, three things occur:

The first is the destabilisation of common political ground for disagreement. If climate change isn’t real, smoking doesn’t cause cancer and vaccines don’t protect children from preventable illnesses, factually informed premises for debate are lost. This inhibits our ability to solve environmental and public health challenges by inducing false definitional doubt and semantic paralysis.

The second is that information attacks, which rely on manipulation, exaggeration and disinformation, require a more nuanced response than warfare’s blunt concepts of deterrence, defence and counter-attack. The resilience and quick restoration of the space for informed debate is far more important: it lessens the damage to other societal decisions affected by the disruption and re-establishes the integrity of the information flows from which a response can be assembled. This does not rule out counter-attack as an option, but in an age when no commonly agreed form of cyber-deterrence exists, the creativity democratic debate allows in finding a long-term solution that neutralises attackers should remain paramount.

The third is more subtle. The structure of the network itself can skew decision-making, corrupting the process that leads to a decision. As a recent study from researchers at MIT, the University of Pennsylvania and the University of Houston revealed, social media networks can bias collective decision-making by margins of greater than 20 per cent. The team conducted more than 100 online experiments involving 2,520 human subjects and modelled real-world gerrymandering on platforms such as Twitter to test the effects of “information gerrymandering”. Even when two parties are of equal size and each player has the same amount of influence in the network, a small number of “zealots” can use their position to disproportionately sway the collective outcome. The researchers found that social media platforms are particularly vulnerable because they allow users to block dissenting voices and create filter bubbles, while providing adversarial actors with the anonymity to exploit people through precisely tailored messages based on their profiles. This suggests that new online platforms may not be suitable for the high-quality informed debate democracy requires.
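The zealot effect can be illustrated with a toy simulation. This is not the study’s actual model; the network structure, parameters and function names below are all illustrative assumptions. The sketch shows how a handful of unwavering accounts, followed by many ordinary voters, can flip an evenly split network.

```python
import random

def simulate(n_voters=100, n_zealots=5, rounds=10, seed=1):
    """Toy majority-influence model (illustrative only). Voters start
    split 50/50 between parties +1 and -1 and repeatedly adopt the
    majority view among the accounts they follow. Zealots never change
    their view (+1) and are followed by every ordinary voter, mimicking
    the disproportionate influence of a few well-placed actors.
    Returns the final share of voters holding view +1."""
    rng = random.Random(seed)
    views = [1] * (n_voters // 2) + [-1] * (n_voters // 2)
    rng.shuffle(views)
    zealots = list(range(n_zealots))  # fixed +1 influencers
    for z in zealots:
        views[z] = 1
    # Each ordinary voter follows 4 random peers plus every zealot.
    follows = {i: rng.sample(range(n_voters), 4) + zealots
               for i in range(n_voters) if i not in zealots}
    for _ in range(rounds):
        new = views[:]
        for i, sources in follows.items():
            total = sum(views[j] for j in sources)
            if total != 0:               # ties leave the view unchanged
                new[i] = 1 if total > 0 else -1
        views = new
    return sum(1 for v in views if v == 1) / n_voters
```

With 5 zealots followed by all 95 other voters, every ordinary voter sees at least 5 votes for +1 against at most 4 peers for -1, so the whole network converges to the zealots’ position; with no zealots, the outcome simply drifts from the initial split. The point of the sketch is that the skew comes from where influence sits in the network, not from the balance of opinion.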

In addition to these issues, there are aspects of this problem that complicate any response.

It is necessary to acknowledge that internal threats are just as dangerous as external ones. In the 2019 Indian general election, the majority of disinformation that flooded public media streams was generated by cyber armies of volunteers supporting the major parties, including the victors, the Bharatiya Janata Party (BJP). These armies spread false stories through popular Indian media such as WhatsApp. In one instance, the “Telangana manifesto”, which falsely purported to be a document demanding a Muslim-only Congress, was spread to exacerbate Hindu-Muslim tensions, benefiting the BJP’s Hindu nationalist platform. Internal checks and balances that restrain political parties from mounting information attacks against their own democracy for political gain are therefore just as important as defences against external threats.

The second aspect is the complexity of the optimisation problem faced by global social media platforms. When building a piece of hardware such as a television, each unit can be designed to meet a particular country or region’s safety standards, such as the use of different power plugs.


Image credit: Alex Stamos

When designing a global social media platform, however, the trade-offs between options become nonlinear and unpredictable at scale. The diagram above shows the trade-offs between democratic values and economic drivers that social media platforms decide for hundreds of democratic and non-democratic jurisdictions every day. These decisions currently sit beyond democracies’ capacity and power to make. They are not immune to public outcry, however, as seen when the Christchurch mosque shootings in New Zealand forced Facebook to change its livestreaming rules.

In a world where information attacks are cheap and virality trumps veracity, what potential solutions can Australia consider to improve democracy’s resilience?

Including information attacks that compromise democracy in Australia’s cybersecurity policy and legal frameworks is a necessity. Government guidance on measures to prevent, identify, expose and mitigate information attacks requires investment, as do regularly updated education programs that teach government and citizens how to spot and expose information attacks, enhancing societal resilience. This is a basic function of maintaining trust in information.

Delineating responsibility is also key. In the last federal election, the Australian Electoral Commission was tasked with identifying disinformation in the media, despite not having the capability, budget or enforcement powers to police and investigate all media and social media platforms. The breaches it did identify fell under electoral advertising rules, and the punishments for malicious actors were negligible. Establishing and equipping a specialist independent law enforcement team that can intervene and use digital forensics to trace and arrest offenders could raise the cost of undermining democracy significantly. However, defining the boundaries of new offences while balancing freedom of speech would require considerable thought and technical understanding from the legal community.

We must also invest in thinking about the policy implications of new technologies for democracy. From combatting synthetic media such as voice cloning and human image synthesis (so-called deepfakes), which can be used to sow mistrust in information attacks, to the conceptual trade-offs and power imbalances between large global technology companies and democracies, the Australian government needs an independent multidisciplinary unit that can consider the operational and strategic implications of these issues. The United States once had an Office of Technology Assessment, which assessed technological advances and translated their implications for policymakers. A similar model that considers whole-of-society effects could be useful.

In order to face significant societal headwinds such as climate change, geopolitical competition and economic reform, Australia needs spaces where its citizens can safely disagree, test solutions and evolve policy through informed, authentic and fact-based communication. Acknowledging the limits of online spaces, testing new ones and protecting information flows from attacks designed to undermine democracy will be crucial to the country’s success.