New Zealand Law Foundation Research Reports
Elliot, Marianne; Berentson-Shaw, Jess; Kueln, Kathleen; Salter, Leon; Brownlie, Ella --- "Digital Threats to Democracy" [2019] NZLFRRp 3

Last Updated: 25 March 2021


DIGITAL THREATS TO DEMOCRACY

A report by The Workshop
May 2019

This report is part of the Digital Threats to Democracy research project.

To see the rest of the reports and the overall findings go to digitaldemocracy.nz

ISBN: 978-0-473-48026-4

CONTENTS

ACKNOWLEDGEMENTS
INTRODUCTION
EXECUTIVE SUMMARY
PURPOSE, DEFINITIONS AND METHOD
Research purpose
Definitions
Methodology
Overview
THE OPPORTUNITIES OF DIGITAL MEDIA
Democratisation of information publishing
Broadening of the public sphere
Increasing equality of access to and participation within political processes
Increasing engagement in political processes
Increasing transparency and accountability from government
Promotion of democratic values
THE THREATS
Increasing power of private platforms
Foreign government interference in democratic processes
Surveillance and data protection
Fake news, misinformation and disinformation
Filter bubbles and echo chambers
Hate speech and trolling
Distrust/dissatisfaction with democracy
THE SOLUTIONS
A Hierarchy for Solutions and Interventions
Change society-wide structural & systems issues
Create supportive environments & contexts - making the default inclusive and safe
Create long-lasting protections for people or intervene to protect them
Build understanding and change individual behaviours
WHAT ROLE FOR NZ?
New Zealand as follower
New Zealand as leader
New Zealand as niche influencer
CONCLUSIONS
Inclusive and transparent processes are critical
More research needed
Evidence-led and principled approach to urgent policy development
Apply human rights principles
Agile and responsive approach to policy
Urgent areas for action
SUMMARY OF KEY FINDINGS FROM EACH PART OF THE RESEARCH
Key findings from the survey
Key findings from the literature review
Key findings from the interviews
APPENDIX 1: LITERATURE REVIEW PART 1
APPENDIX 2: LITERATURE REVIEW PART 2
APPENDIX 3: REPORT ON QUANTITATIVE SURVEY
APPENDIX 4: REPORT ON QUALITATIVE INTERVIEWS
APPENDIX 5: ONLINE HATE AND OFFLINE HARM REPORT




ACKNOWLEDGEMENTS

This research was made possible by support from our major funder, the New Zealand Law Foundation’s Information Law & Policy Project, with additional research funding from the Luminate Group.

The New Zealand Law Foundation - Te Manatū a Ture o Aotearoa – provides grants for legal research, public education on legal matters and legal training. The Information Law & Policy Project (ILAPP) funds projects that will better prepare New Zealand for the challenges of the information age. The project is intended to support the growth, understanding and resilience of New Zealand and prepare the country for future digital competence. It will support and feed into work the public and private sector is undertaking, but will remain independent.

The Luminate Group is a global philanthropic organisation with the goal of empowering people and institutions to work together to build just and fair societies. They fund and support projects that will help people participate in and shape the issues affecting their lives, and make those in power more transparent, responsive, and accountable. Their focus is on civic empowerment, data & digital rights, financial transparency, and independent media.

The research team on this project was led by Marianne Elliott (The Workshop) and included Dr Jess Berentson-Shaw (The Workshop), Dr Kathleen Kuehn (Victoria University of Wellington), Dr Leon Salter (Massey University) and Ella Brownlie (The Workshop).

Project management was provided by Jay Brooker (The Workshop). The quantitative survey was conducted by UMR Market Research.

We particularly want to thank all the participants in the interviews for this research. They volunteered their time on this project, brought both goodwill and clear thinking to the unreasonably broad scope of the research, and responded rapidly to our requests for feedback. This kind of generosity and rigour across the sector gives us hope that, with more investment in research and a more joined-up government approach to the subject, New Zealand will be able to contribute substantially to an urgently needed global collaboration on solutions.


INTRODUCTION

In February this year, as I pored over the findings of our literature review and read through hundreds of pages of interview transcripts, I wrote that an adequate response to the problem of online hate, harassment and abuse was possible. It would require a recalibration of our policy approach, some international diplomacy and cooperation, and a sufficiently diverse group of decision-makers at the helm.

I believed then that all of that was within the capacity of the New Zealand government, and that there was “likely to be a leadership role for our country in global efforts to combat online abuse and, as Sir Tim Berners-Lee has put it, ‘fight for the web’.”

This belief has proven to be well-founded, although under circumstances none of us ever wanted to witness. As we completed this research, it was announced that New Zealand’s Prime Minister, Jacinda Ardern, would meet the French President, Emmanuel Macron, in Paris to “bring together countries and tech companies in an attempt to stop social media being used to promote terrorism.” The meeting will invite world leaders and tech company CEOs to sign a pledge called the ‘Christchurch Call’.

In many ways, the devastation of the Christchurch mosque massacres has proven to be a turning point for New Zealand on this, and other policy issues. We now know that our small size and relative remoteness do not render us immune to the terrible harm that can be done by a person motivated by hatred, inspired by the internet and armed with a semi-automatic weapon.

In the wake of the March 15 attacks, in response not only to the unthinkably cruel and manipulative use the terrorist made of the internet in the course of the attack but also to the many ways in which online spaces have allowed hatred to grow and spread, many people - including our Prime Minister - called for greater accountability and care from the big digital platform companies, including Facebook and Google.


It’s a call some of us have been urging our government to make for some time now, and many feel it is long overdue. But here we are now, and this is a crucial moment in the history of the relationship between citizens, governments around the world and a handful of people who not only control a significant portion of the means by which we all communicate and the distribution of news and information to vast percentages of the world’s population, but also hold huge quantities of personal data about us all.

The question is no longer whether something needs to change. The question has become: what precisely needs to change? And even more importantly: what can be done? What evidence do we have as to the interventions and solutions that might mitigate against the biggest threats posed to our democracy by digital media, without losing the best of the opportunities that the internet offers? Those are the questions we set about answering with this research.

We are far from the first people to tackle these questions, as our literature review reveals. Researchers, academics, journalists and former employees of the big tech companies have been studying and writing about the impact of digital media on democracy in increasing numbers over recent years.

In his book The People vs Tech, Jamie Bartlett predicted: “In the coming few years, either tech will destroy democracy and the social order as we know it, or politics will stamp its authority over the digital world.” In his view, “technology is currently winning this battle, crushing a diminished and enfeebled opponent.”

Similarly, in How Democracy Ends, David Runciman assessed the comparative strengths of the tech giants versus governments, in a ‘Leviathan vs Leviathan’ showdown for the future of democracy. Although he gave governments more of a shot than Bartlett had, he concluded that while “Facebook will not take down the Leviathan in mortal combat ... it could weaken the forces that keep modern democracy intact.”

But neither Runciman nor Bartlett, nor any of the analysis I’ve read over the past year, predicted the situation we are now in. None imagined a Prime Minister with a global reputation for compassion, armed with moral courage, clarity and the support of an outraged nation.

Has Jacinda Ardern become the global leader capable of taming the tech giants? There are good reasons to hope so, and even more reasons to ensure that this rare opportunity is neither wasted nor lost.


NEED FOR A COHESIVE, EVIDENCE-BASED APPROACH TO POLICY

One of the challenges of rapidly developing policies on digital media in response to a situation like the Christchurch attacks is that this entire area of policy has been relatively neglected until recently. As one interviewee in this research said, we need a better system for making policy on these issues before we can be any kind of global leader. “Smart people just basically giving their opinion with no real information behind it” won’t be good enough to develop the kind of solutions demanded by this particular set of problems and, they say, “that’s how we’ve made our policy in this space, generally.”

Until very recently, there was no centralised or coordinated government process for developing policies and strategies in response to the challenges posed by digital media. Responsibility fell to a wide range of different agencies and teams, and policy development was consequently, inevitably, fragmented. In the process of doing this project, we found it difficult to establish who in government, if anyone, had a broad view over the full range of issues raised in our research. Recently, new efforts at coordination have begun to appear, with some degree of overarching responsibility, although not necessarily with the resources needed to develop policy across such a wide-ranging and rapidly changing area.

In the past, according to one interviewee, New Zealand has either simply adopted the policy approach taken in another jurisdiction “or we have a relatively flimsy policy discussion which isn’t founded in evidence.” In order to build our capacity as a country to understand and deal with these issues, they argue, we need more of an evidence base. “Before we can be leaders in any sense, we need to be equipped to have a solid base for developing policy ourselves.”


What our research shows is that it is critical that the Prime Minister and her advisors look beyond immediate concerns about extremism and content moderation, and ensure that our government’s efforts in this moment take into account the wider structural issues that created the conditions in which a live video of an act of such violence could be shared and viewed so widely.

Those wider structural issues include, in particular: the impact of platform monopolies, in which a handful of people have the power to determine the social interactions and access to information of millions of people; algorithmic opacity, in which algorithms have ever-increasing influence over what we hear and see without appropriate transparency or accountability; and the attention economy, which gives priority to content that grabs attention, without sufficient regard to potential harm.

Our intention is that this research will contribute to a wider consideration of the issues arising from digital media’s impact on democracy, and to the development of a body of evidence which supports this critical work.

Marianne Elliott

Lead Researcher, Digital Media and Democracy Co-Director, The Workshop



EXECUTIVE SUMMARY

Leon Salter, Kathleen Kuehn, Jess Berentson-Shaw & Marianne Elliott





INTRODUCTION

As we completed this report it was announced that New Zealand’s Prime Minister, Jacinda Ardern, would meet the French President, Emmanuel Macron, in Paris to “bring together countries and tech companies in an attempt to stop social media being used to promote terrorism.” The meeting will invite world leaders and tech company CEOs to sign a pledge called the ‘Christchurch Call’.

The question is no longer whether something needs to change. The question has become: what precisely needs to change? And even more importantly: what can be done? What evidence do we have as to the interventions and solutions that might mitigate against the biggest threats posed to our democracy by digital media, without losing the best of the opportunities that the internet offers? Those are the questions we set about answering with this research.

One of the challenges of rapidly developing a policy response on digital media in response to a situation like the Christchurch attacks is that this entire area of policy has been relatively neglected until recently. As one participant in this research said, we need a better system for making policy on these issues before we can be any kind of global leader. In order to build our capacity as a country to understand and deal with these issues, we need a better evidence base.


What our research shows is that it is critical that the Prime Minister and her advisors look beyond immediate concerns about violent extremism and content moderation, to consider the wider context in which digital media is having a growing, and increasingly negative, impact on our democracy.

BACKGROUND

Over recent years a growing body of international research has looked at the impact of digital media on democracy, with particular focus on the US and the UK, where the role played by digital media in the election of Trump and the Brexit referendum raised significant concerns.

Our project was designed to find out whether we should be worried about these same issues here in New Zealand, and if so, what we should do about them. In order to answer that question we identified five key features of democracy against which we could measure the impact of digital media, for better and for worse. They are:


> Electoral process and pluralism

> Active, informed citizens

> Shared democratic culture

> Civil liberties and competitive economy

> Trust in authority

WHAT WE’VE FOUND

Critically, we found that digital media is having an impact across every one of those features of a healthy democracy.

There are indicators that digital media has had some beneficial impacts. Our quantitative research here in New Zealand indicates, for example, that people from minority groups have been able to use digital media to participate in democratic processes including accessing political players, and engaging in public debate. Whatever our response to the challenges posed to democracy by digital media, it’s important we don’t lose these opportunities in the process.

But the overall trend should raise serious concerns. Active citizenship is being undermined in a variety of ways. Online abuse, harassment and hate - particularly of women, people of colour, queer people, people with disabilities and people from minority religions - undermines democratic participation not only online, but offline. Misinformation, disinformation and mal-information are undermining not only informed debate, but also public trust in all forms of information. Distraction and information overload are eroding citizens’ capacity to focus on important and complex issues, and their capacity to make the ‘important moral judgements’ required in a healthy democracy.

Likewise, interviewees described a myriad of ways in which our shared democratic culture is being undermined by digital media - including through disinformation, polarisation, attention hijacking and radicalisation.

One of the clearest impacts of digital media on our democracy has been its impact on funding for mainstream media. While Facebook and Google hoover up the advertising revenue that once would have been spent on print, radio and television advertising, they contribute nothing to the work of producing the kind of news and current affairs reporting that is essential to a functioning democracy.

The representative survey we carried out indicates that New Zealand’s small size and relatively healthy mainstream media (relative to elsewhere, and despite significant resource challenges) may help us avoid the worst effects of ‘filter bubbles’ and ‘echo chambers’ in digital media on some issues.

Interviewees in our qualitative research nonetheless pointed to examples where debate in New Zealand about issues like free speech, hate speech and gender identity attracted the attention of foreign actors holding strong, even extreme, views on these issues. Engagement by these foreign actors in online public debate on these issues appears to have contributed to a polarisation of views here.

THE THREE CORE PROBLEMS TO EMERGE FROM OUR RESEARCH

At the heart of the challenges to democracy posed by digital media are three core problems:


  1. Platform monopolies: two or three corporations control not only our means of communication, but also the content which is distributed, both of which are core aspects of our democracy. The market power and global mobility of these companies make it possible for them to avoid national regulatory measures, either by moving operations elsewhere or simply by ignoring them;
  2. Algorithmic opacity: algorithmic engines are using huge quantities of personal data to make ever more precise predictions about what we want to see and hear, and having ever-increasing influence over what we think and do, with little transparency about how they work or accountability for their impact; and
  3. Attention economy: the dominant business model of digital media prioritises the amplification of whatever content is best at grabbing our attention, while avoiding responsibility for the impact that content has on our collective wellbeing and our democracy. The negative impact is brutally clear from both the literature and the world around us.

NEED FOR A SYSTEMIC RESPONSE

The key message is clear: digital media is having massive, system-wide impacts on our democracy. It affects every part of our lives, and the people who run the corporations controlling the major platforms are having a determinative impact on the very structures and functions of our society. While better content moderation is clearly one of the responses we must demand of the platforms, it is not even close to being a sufficient response to the scale of the challenge.

It’s critical that this moment of global cooperation is used to address the wider, structural drivers of the biggest threats posed to democracy by digital media. These structural drivers include the power that a handful of privately-owned platforms wield over so many aspects of our lives: what information we see, who we interact with, and who can access information about us. And we must do this while maintaining and building upon the many opportunities digital media simultaneously offers to tackle some of the biggest challenges facing democracy, including inequity of access and declining engagement.


THE OPPORTUNITIES OF DIGITAL MEDIA

The potential of digital democracy lies in its ability to increase democratic participation, embrace diversity of opinion, and empower marginalised groups. We identified six clear opportunities from the literature that digital media offers. These are: the democratisation of information publishing, broadening the public sphere, increasing equality of access to and participation within political processes, increasing participation and engagement in political processes, increasing transparency and accountability from government, and promotion of democratic values.

Broadly speaking, these opportunities fit into two categories. First, those that give individuals, citizens or groups who, due to their status in society, have been excluded from fully participating in different aspects of the democratic process greater access to the levers of democracy. Examples include the use of digital media to: broaden the public’s engagement with indigenous people and their lives, give more exposure to women in politics, build well-networked, educated and empowered communities, and encourage political engagement from youth.

The second category of opportunities relates to digital media’s use by people in government to make the processes of democracy more inclusive, to increase engagement with citizens, improve the transparency of government work, and rebuild trust in democratic processes. Examples of such work include: online deliberative democracy processes, open or e-government initiatives, and funding of public service journalism, platformed on digital media.

The opportunities for digital media are significant and important. If used well, digital media can enable governments to respond effectively to the experiences of marginalised groups, to ensure equitable policies and practices are designed, delivered and adjusted, and to build trust in democratic institutions as responsive to the needs of all people. It offers as much to people pushing against barriers to their progression, inclusion, and improved wellbeing in society, as it does to people in government looking to remove those barriers and build a more inclusive democratic system.


THE THREATS

However, the threats to this promise outlined in the literature are significant, and most are intricately bound up with the concentration of power in profit-driven companies. The seven key threats to inclusive democracy from digital media we identified were: the increasing power of private platforms, foreign government interference in democratic processes, surveillance and data protection issues, fake news, misinformation and disinformation, filter bubbles and echo chambers, hate speech and trolling, and distrust of/dissatisfaction with democracy.

Researchers highlighted the increasing dominance of an ever smaller number of privately-owned platforms over the internet. The people who own and control these platforms have a tendency to monopolise, linked to the relationship between the mining of user data and their imperative to make profits. This model of operation is termed “platform monopolies”. The monopolisation tendency makes it hard for alternatives to the data-extraction-for-profit model - for example, co-operative, democratised ownership models - to start up and survive.

The concentrated power of these platforms shapes not just the wider information context and the ability to develop alternative, non-extractive models of digital information provision and sharing, but individuals’ personal lives as well. Through algorithms, platform monopolies affect how, and with whom, we interact socially. A body of literature points to how the actions of the people running these companies impact human rights, both through the control of personal data and through the level of control over what appears in the public sphere.


From this model of platform monopolies flows a series of further threats to democracy. Some relate to features of the platforms directly linked to the capture of people’s personal data. The collection and on-sale of personal data by these platforms - to both governments looking to undertake surveillance of their own citizens and private organisations wishing to make profits - erodes public trust in information systems, and curtails the professional work of media and writers, a key plank in our democracy.

The creation of the “attention economy” also poses a significant threat. People’s propensity to attend to shocking, false, or emotive information, especially political information, is exploited and used as a commodity product by digital media platforms. The literature shows that governments with the means and inclination to manipulate information can tailor false information towards individuals with the express intent of interfering in other countries’ democratic processes - for example, the Russian government’s interference in the 2016 US election using ‘bots’ and disinformation campaigns.

Misinformation and disinformation, especially political disinformation targeted at individuals on digital media, are used to influence politics with both a big and a small ‘p’ - from national elections through to information provision and sharing on political issues and policy more generally.

Filter bubbles are a specific technical effect of this attention economy. Facebook’s news feed is a filter bubble, created by a machine-learning algorithm which draws on data created by user networks, likes and comments, and on how much organisations are willing to pay to be present there. Filter bubbles, and the related echo chambers they feed into (in which people attend only to information which confirms what they already believe), are linked to a decline in trust in the ability of traditional news media to provide reliable information, have been found to exacerbate political divisions and polarisation, and have negative implications for the mechanisms of liberal democracy, as developing a broad consensus around decisions made in the public good becomes increasingly difficult.

The rise of hate speech and trolling is linked to the polarising effects of filter bubbles and echo chambers. Both racialised and sexualised hate speech are specific threats exacerbated by the anonymity provided by digital media. Hate speech and trolling on digital media encourage affected groups to retreat to safe locations, rather than engaging with national debates and institutions. Research has found a correlation between strong, vocal disagreement with an individual’s perspectives and a “spiral of silence” which acts to curtail the voicing of contentious opinions by minority groups. The particular ability of trolls and hate speech to fan antagonistic ‘flames’ rather than promote rational debate has a direct impact on democratic participation.

While people’s distrust of democratic processes is a longer-term issue, digital media has likely exacerbated this pattern across western democracies. Researchers argue that trust, informed dialogue, mutual consent, and participation - fundamental features of democracy - are being eroded by the features that make social media so profitable.

Researchers also found that the way information is distributed on digital media (horizontal, decentralised and interactive) increases intolerance of others, polarisation and scepticism toward democracy.

The opportunities of digital media, while still apparent, appear to have been suppressed by the sheer weight of fake news, filter bubbles, populism, polarisation, hate speech, trolls and bots that have emerged from the concentration of power in a small group of private organisations seeking to maximise profits. Digital platforms, initially celebrated for their democratic possibilities, have transformed into anti-democratic power centres through the collection and exploitation of users’ attention and data. These privately-owned platforms have largely escaped public oversight or regulation of their ability to harness this new power for commercial or political gain.

The question is what can policymakers do to recalibrate? Are there empirically tested public policies and approaches that can ensure digital media works to strengthen and deepen democracy?


THE SOLUTIONS

While our literature review was not exhaustive, in general we found a dearth of empirically tested solutions. Likewise, and possibly because of the lack of evidence in the literature, the experts interviewed for this research generally had more to say about the risks and threats they saw arising from digital media than they did about potential workable solutions. However, we did find some common ground on solutions between the literature and the interviews. Below we discuss potential solutions for each of the identified threats, with a focus on optimising the opportunities. We have organised those solutions in a hierarchy, from those we think will have the greatest impact with the least effort required by individual citizens, through to those with the least impact and the most individual citizen effort. That is not to say that the politics of implementing the solutions with the greatest impact are not difficult, but the political effort required is justified by the potential for positive impact.


CHANGE SOCIETY-WIDE STRUCTURAL & SYSTEMS ISSUES

This section focuses on structural and systemic change, addressing, for example, the disproportionate power of the tech giants vis-a-vis governments, citizens and their domestic competitors.


REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Regulating platforms like other industries. Currently, regulatory debates largely centre on defining the structure, terms and conditions of the kind of industry private intermediaries represent. How platforms should be regulated or governed thus partly hinges on how these services are defined; for example, whether social media platforms are media companies, public spaces, utilities or some other service largely informs how they can ultimately be governed. There is little or no empirical evidence to show how regulation in this area would or would not work, and therefore adaptive approaches to policy and regulation will be needed. This will involve ensuring that the impacts of any change are regularly monitored, and changes made as needed in response to those findings.

Introducing new modes of collective action. Under industrial capitalism we had collective bargaining and the strike: forms of collective action that were sanctioned by law and had the support of society, allowing people to tame capitalism with legal protection. In relation to digital media, researchers suggest there are opportunities for more collective action both by tech workers - demanding, for example, more ethical design in the products they work on - and by digital media users.


COMBAT FAKE NEWS BY:

Supporting a vibrant and diverse media sphere. One that balances strong, independent and adequately resourced public service media with a non-concentrated commercial media sector. This is a proposed but untested idea.

CREATE SUPPORTIVE ENVIRONMENTS & CONTEXTS - MAKING THE DEFAULT INCLUSIVE AND SAFE


REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Designing new competitive digital media solutions. Disruptive technology is needed to forge an alternative digital future that, in turn, facilitates a more democratic internet. This means the creation of platforms offering a different set of affordances (i.e. not those driven by platform monopolies).


REDUCE INTERFERENCE FROM FOREIGN GOVERNMENTS AND POWERS BY:

Designing new cybersecurity infrastructure and drawing on 'big datasets' to review and assess electoral policies. The research in this area is also largely normative, generally prescribing such infrastructure and reviews on the basis that they will reduce threats to elections and other political processes.

ADDRESS SURVEILLANCE & IMPROVE DATA PROTECTION BY:

Regulating companies’ information management practices. Some regulatory measures, like Singapore’s Personal Data Protection Act 2012, have been proven effective in bringing formal charges for data mismanagement and abuse.

Making regulatory changes to data privacy policies. However, there is little evidence to suggest that these changes will reduce surveillance and data collection so much as regulate how that data is stored, accessed and used by data collectors and other third parties.

REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Building citizen-consumer activism and creating a “sea change in public opinion”. Scholars and theorists suggest that a shift in public attitudes is needed to persuade digital media companies to change; however, there is no empirical data to draw upon as to how effective this approach would or would not be.


COMBAT FAKE NEWS BY:

Developing and circulating persuasive counter-narratives. The focus would need to be on emotional, not rational, appeal. This is proposed but unmeasured.

OVERCOME FILTER BUBBLES/ECHO CHAMBERS, AND SILENCING EFFECTS OF HATE SPEECH BY:

Supporting new platform designs with different design affordances.

The design of platform affordances has an impact on inclusion and participation, as well as the types of interactions people experience and the information they are exposed to. There is some suggestion that design affordances can reduce the effects of filter bubbles by engaging internet users in more ideologically diverse communities. Well-designed, collectively-owned, online deliberative fora like Loomio have been empirically shown to also create a safe environment for marginalised groups. Research suggests that intentionally building more participatory forms of engagement into platforms might reduce filter bubbles, echo chambers and incivility, while increasing communication and deliberative processes.

IMPROVE TRUST IN DEMOCRACY BY:

The creation, selection and use of online platforms that afford citizen participation and deliberation. Some empirical evidence shows that direct and participatory democratic engagement/processes, e-government, and open government improve trust.

International research has found that engaging citizens in deliberative processes often results in profound changes in deliberating citizens’ views, “frequently in the direction of more common good-oriented policies.”

Using digital government processes. Transparent, easy-to-access and well-designed e-government and open government initiatives have been shown to increase positive feelings and citizen trust in local government.

CREATE LONG-LASTING PROTECTIONS FOR PEOPLE OR INTERVENE TO PROTECT THEM


REDUCE THE POWER OF PRIVATE PLATFORMS, COMBAT INCIVILITY AND MISINFORMATION ONLINE BY:

Improving content moderation. Calls for new regulatory policies around content moderation acknowledge that this remains an opaque and difficult practice, and on its own is not a fix-all solution. Current policies at the largest intermediaries attempt to balance stakeholder expectations (including those of users, consumers, advertisers, shareholders and the general public), commercial business goals, and jurisdictional norms and legal demands (which are generally governed by liberal-democratic (US) notions of “free speech”). Goals related to inclusive and participatory democracy are not included.

The most common ‘workable solutions’ presented for content moderation are processes that combine technical and social (human) responses. Advances in semi- or fully-automated systems, including deep learning, show increased promise in identifying inappropriate content and drastically reducing the number of messages human moderators then need to review. In the literature, however, researchers note that neither automated nor manual classification systems can ever be “neutral” or free from human bias, and that human and/or automated content moderation is unlikely to achieve “civil discourse” goals through moderation alone. The combination of automated classification and deletion systems and human efforts therefore remains the most effective content moderation strategy currently on offer. In the few places where they exist, government regulations on private intermediaries’ moderation practices have not been empirically tested for their efficacy or effectiveness.


COMBAT FAKE NEWS BY:

Multi-stakeholder content moderation. This is an approach that combines human and technical intervention; however, it is a proposed but untested solution.


REDUCE HATE SPEECH/TROLLING BY:

Using identity verification systems. Sites that do not allow anonymisation and require pre-registration have been shown to solicit qualitatively better, but quantitatively fewer, user comments because of the extra effort required to engage in discussion.

Empirical research has also found that abusive comments are minimised when anonymous commenting is prohibited.


BUILD UNDERSTANDING AND CHANGE INDIVIDUAL BEHAVIOURS


ADDRESS SURVEILLANCE AND DATA PRIVACY ISSUES BY:

Encouraging individuals to employ technical solutions. Such solutions include ad-blockers and ad-tracking browser extensions, private browser options (e.g. Tor), open source platforms and cooperative platform models. “Evidence” supporting the efficacy of these tools and alternatives, however, is typically anecdotal.


COMBAT FAKE NEWS BY:

Education, particularly around critical thinking. Evidence for this approach has emerged in the health field.


REDUCE HATE SPEECH BY:

Building resilience through support networks. Developing fast and effective reporting mechanisms and support networks. A networked approach can effectively combat the effects of hate speech, for example by building counter-narratives that counteract racism.

Coordinating diverse stakeholders to apply pressure to private intermediaries in ‘long-haul’ campaigns has also been effective in having hateful content removed from social media. Speed of removal is considered essential to defusing the power of hate speech and trolling.


IMPR O VE TR US T IN DEM O CR A C Y B Y :

Civics education. Educating children in schools on ‘good citizenship’ has been positively associated with increased political engagement.


CONCLUSIONS


THE THREE CORE PROBLEMS TO EMERGE FROM OUR RESEARCH


At the heart of the challenges to democracy posed by digital media are three core problems:


> Platform monopolies: two or three corporations control not only our means of communication but also the content which is distributed, both of which are core aspects of our democracy, while the market power and global mobility of these companies make it possible for them to avoid national regulatory measures, either by moving operations elsewhere or simply ignoring them;

> Algorithmic opacity: algorithmic engines are using huge quantities of personal data to make ever more precise predictions about what we want to see and hear, and having ever increasing influence over what we think and do, with little transparency about how they work or accountability for their impact; and

> Attention economy: the dominant business model of digital media prioritises the amplification of whatever content is best at grabbing our attention, while avoiding responsibility for the impact that content has on our collective wellbeing and our democracy. The negative impact is brutally clear from both the literature and the world around us.


KEY PRINCIPLES FOR POLICY RESPONSE:


Use democratic processes, which provide some degree of transparency about the decisions being made, accountability as to their impacts, and opportunities for challenge and judicial review. These processes must include meaningful participation by diverse representatives of the people whose lives are impacted by digital media. In particular, Internet users and civil society must have meaningful involvement, as the crucial third party in the multi-stakeholder process.

Draw on the evidence as to what is most likely to work, where it exists. Perhaps the most predictable finding of this research is that there has been little or no investment by government or other research funders in experimenting with and recording possible solutions, and there needs to be more.

Evidence-led and principled approach. Where there are gaps in the evidence, there are key principles that can be followed to reduce the risk of implementing solutions that do more harm than good. These include an evidence-led focus on ‘upstream’ structural change and the application of human rights principles.

DIGIT AL THRE A TS T O DE MO CR A CY

Focus on structural or ‘upstream’ change. Tackle the structural drivers that underlie all the downstream problems - such as online abuse, disinformation, radicalisation and polarisation. Solutions should be designed to intervene at the structural level and to rebalance power through, for example: governance structures; regulation to restore transparency, accountability and fair competition; and genuinely participatory and representative multi-stakeholder processes. None of this is to say that design solutions and platform affordances are not important. As the research shows, they will be essential. But without some rebalancing of power, and without increasing the diversity of people involved in decision-making at the highest levels, those design solutions run the risk of replicating very similar problems to those we now face.

Respect and protect human rights. The following human rights principles should also be applied to policy development in this area:


> Universality: Human rights must be afforded to everyone, without exception.

> Indivisibility: Human rights are indivisible and interdependent.

> Participation: People have a right to participate in how decisions are made regarding protection of their rights.

> Accountability: Governments must create mechanisms of accountability for the enforcement of rights.

> Transparency: Transparency means governments must be open about all information and decision-making processes related to rights.

> Non-Discrimination: Human rights must be guaranteed without discrimination of any kind.

Agile approach. In the absence of a strong evidence base, it makes sense to take an agile, iterative approach to policy change: experiment with all the policies, all the time. Ensure that the funding, design and implementation of policies reflect a record, learn and adapt approach, measuring the impact of any new initiatives or regulations and making adjustments as evidence of impact becomes available.


URGENT AREAS FOR CHANGE


Some of the areas in which action is needed sooner rather than later include efforts to:

Restore a genuinely multi-stakeholder approach to internet governance, including rebalancing power through meaningful mechanisms for collective engagement by citizens/users;

Refresh antitrust & competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness, with a particular focus on public interest media;

Recommit to publicly funded democratic infrastructure including public interest media and the creation, selection and use of online platforms that afford citizen participation and deliberation;

Regulate for greater transparency and accountability from the platforms, including algorithmic transparency and greater accountability for verifying the sources of political advertising;

Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and

Recalibrate policies and protections to address not only individual rights and privacy but also collective impacts on wellbeing. Policies designed to protect people online need to have indigenous thinking at their centre, and all public agencies responsible for protecting democracy and human rights online should reflect, in their leadership and approaches, the increasing diversity of our country.



ACKNOWLEDGEMENTS

This research was made possible by support from our major funder, the New Zealand Law Foundation’s Information Law & Policy Project, with additional research funding from the Luminate Group.

The research team on this project was led by Marianne Elliott (The Workshop) and included Dr Jess Berentson-Shaw (The Workshop), Dr Kathleen Kuehn (Victoria University of Wellington), Dr Leon Salter (Massey University) and Ella Brownlie (The Workshop). Project management was provided by Jay Brooker (The Workshop).

The quantitative survey was conducted by UMR Market Research.




PURPOSE, DEFINITIONS AND METHOD


RESEARCH PURPOSE


Digital media has been heralded as inherently democratising. People have direct access to each other and to their elected representatives, across geographical and cultural boundaries. But increasingly it is also seen as a space in which democracy may be simultaneously undermined.

As digital technology increasingly permeates society, there is good reason to pay attention to the institutions, policies and practices that surround this technology and present both opportunities and threats to democracy. This is especially true for government, as the people who represent the interests of all citizens, but it is also true for everyone with an interest in the future health of our democracy.

The purpose of this research was to explore the opportunities, risks and threats posed to New Zealand’s democracy by digital media, in order to scope future research into the policy solutions available to New Zealand to maximise the opportunities, and to meet and mitigate the threats.


DEFINITIONS

In order to assess the impact of something like digital media on democracy, you need a definition of democracy. We used a definition of democracy adapted from the framework developed by the Economist Intelligence Unit for their Democracy Index report, and the definition used by Jamie Bartlett in his book ‘The People vs Tech’.

The five features of democracy in our definition are:

Electoral process and pluralism: including whether elections are free, fair and trusted.

Active citizens: alert, informed citizens who are capable of making important moral judgements, including measures of equity and diversity in representation.

Shared democratic culture: enough societal consensus, cohesion and willingness to compromise for a stable, functioning democracy. In New Zealand, this includes compliance with te Tiriti o Waitangi, on which our democratic culture is founded.

Civil liberties and competitive economy: a functioning competitive economy and civil society, including protection of human rights and free, independent media.

Trust in authority: a trustworthy government, parliament and judiciary and elected representatives accountable to the people.

Defining democracy is complex. Defining digital media is almost as difficult. Digital media technically includes all digitised content that can be transmitted over the internet or computer networks. This could include text, audio, video, and images. So content from print or broadcast media outlets can fall into this category when it is presented on a website or blog.

The focus of this research was on social media, online forms of communication that people use to share and exchange information with interested audiences, and within that, a specific focus on the major digital platforms, including Facebook, YouTube and Twitter. However, interviewees also talked about the impact of other forms of digital media on democracy, including blogs, online forums and digital forms of traditional media.


METHODOLOGY


This research project was made up of three separate but related strands: expert interviews, a literature review, and a quantitative survey.


EXPERT INTERVIEWS

A series of in-depth interviews were conducted with experts and stakeholders to explore the scope of this issue in more detail, prioritise various aspects of the problem for future research and identify key potential collaborators for further research.

Thirty-five in-depth interviews were undertaken with a selection of experts, stakeholders and users drawn from the following sectors:


> Political

> Policy and official

> Māori-led organisations

> Civil society

> Industry/sector organisations

> Academics, researchers and experts

> International experts.

The interviews were recorded, transcribed and analysed using a hybrid of content and grounded analysis, in which some broad themes were used as a starting framework but were amended and altered based on the themes that emerged from the data as the analysis progressed.

The full report on the interviews can be accessed here.


LITERATURE REVIEW

The literature review was conducted in two parts: one looking at the nature of the opportunities and threats to democracy from digital media, and the other looking at the evidence as to effective solutions and responses.


Part One: Opportunities and Threats

In this narrative literature review, we sought to describe, from the most recent literature (searches were limited to research published in the last eight years, and most sources are within five), the nature of the opportunities and threats to democracy from developments in digital media.

We asked two research questions:


  1. What are the specific opportunities digital media presents for improving democratic participation?
  2. What are the current threats/barriers that are in place to prevent achieving those opportunities?

A non-systematic narrative review was chosen with a view to summarising the themes that have been covered in terms of opportunities and problems (risks and threats).

In total, 110 documents were reviewed, including journal articles, reports and book chapters.

Part Two: Solutions

Following on from the review of the literature identifying the opportunities and threats that digital media pose to an inclusive and participatory democracy (Part One), we undertook a review to identify tested and workable solutions to realising the potential of digital media and/or overcoming current threats.

A non-systematic narrative review was chosen with a view to summarising the evidence. Searches were limited to research published in the last eight years (most are within five). It was not an exhaustive review, but in general we found a dearth of empirically tested solutions. This is not surprising given the relatively slow response of government and other public institutions (from where such research would most logically be situated and/or funded) to the threats from digital media.

The review is presented in three parts: 1) the empirical evidence on workable solutions to threats to democracy from digital media; 2) a summary of recommendations found in the literature; and 3) a brief discussion of some activities identified in New Zealand.

The full literature reviews can be accessed here.

QUANTITATIVE SURVEY

The third component of this research was an online survey of a nationally representative sample of New Zealanders aged 18 years and over. The survey was designed to elicit the views and experiences of people using social media and digital platforms relevant to democracy (e.g. participating in debates about issues of public policy on social media).

1,000 people were surveyed, with responses weighted to accurately reflect the New Zealand population in relation to region, age, gender and ethnicity. Fieldwork was carried out from the 27th of September to the 2nd of October 2018.

The full report on the survey can be accessed here.


OVERVIEW


Over recent years a growing body of international research has looked at the impact of digital media on democracy, with particular focus on the US and the UK where the role played by digital media in the election of Trump and the Brexit referendum raised significant concerns.

This project was designed to find out if we should be worried about these same issues here in New Zealand. And if so, what should we do about it? In order to answer that question we identified five key features of democracy against which we could measure the impact of digital media, for better and for worse. They are:


> Electoral process and pluralism

> Active, informed citizens

> Shared democratic culture

> Civil liberties and competitive economy

> Trust in authority

WHAT WE’VE FOUND

Critically, we found that digital media is having an impact across every one of those features of a healthy democracy.

There are indicators that digital media has had some beneficial impacts. Our quantitative research here in New Zealand indicates, for example, that people from minority groups have been able to use digital media to participate in democratic processes including accessing politicians and engaging in public debate. Whatever our response to the challenges posed to democracy by digital media, it’s important we don’t lose these opportunities in the process.

But the overall trend should raise serious concerns. Active citizenship is being undermined in a variety of ways. Online abuse, harassment and hate - particularly of women, people of colour, queer people, people with disabilities and people from minority religions - undermines democratic participation not only online, but offline.

Misinformation, disinformation and mal-information are undermining not only informed debate, but also public trust in all forms of information. Distraction and information overload are eroding citizens’ capacity to focus on important and complex issues, and their capacity to make the ‘important moral judgements’ required in a functioning democracy.

Likewise, interviewees described a myriad of ways in which our shared democratic culture is being undermined by digital media - including through disinformation, polarisation, attention hijacking and radicalisation.

One of the clearest impacts of digital media on our democracy has been its impact on funding for mainstream media. While Facebook and Google hoover up the advertising revenue that once would have been spent on print, radio and television advertising, they contribute nothing to the work of producing the kind of news and current affairs reporting that is essential to a functioning democracy. In a stunning display of hypocrisy, Facebook recently complained that their local news service was being hindered by a lack of local newspapers, many of which were forced to either shut down or significantly reduce their newsroom size after losing advertising income to Facebook.

The representative survey we carried out indicates that New Zealand’s small size and relatively healthy mainstream media (relative to elsewhere and despite significant resource challenges) may help us avoid the worst effects of “filter bubbles” and “echo chambers” in digital media on some issues.

When asked about the legalisation of cannabis, New Zealanders who got their information about the issue online were able to predict relatively accurately whether the majority of New Zealanders shared their views or not: a third of those who disagreed could accurately predict that theirs was a minority view, and most of those who agreed could accurately predict that theirs was the majority view. This may be unique to the debate about drug reform because, for example, there had been significant media coverage of opinion polls on this issue. More research would be needed to see if this is replicated across other issues in New Zealand.

Interviewees in our qualitative research nonetheless pointed to examples where debate in New Zealand about issues like free speech, hate speech and gender identity attracted the attention of foreign actors holding strong, even extreme, views on these issues. Engagement by these foreign actors in online public debates on issues here in New Zealand appears to some interviewees to have contributed to a polarisation, even radicalisation, of views here. Interviewees also raised concerns that the ability of citizens to form free and informed opinions was being undermined not only by mis- and disinformation, but by the increasing role of algorithms in predicting and curating the information each of us is exposed to.


THE NEED FOR A SYSTEMIC RESPONSE

We could continue to outline the impact digital media is having on trust in public institutions, free and fair elections, the protection of human rights and a competitive economy. More on all of that below. The key message is clear: digital media is having massive, system-wide impacts on our democracy. It affects every part of our lives, and the people who run the corporations controlling the major platforms are having a determinative impact on the very structures and functions of our society. While better content moderation is clearly one of the responses we must demand of the platforms, it is not even close to being a sufficient response to the scale of the challenge.


THE THREE CORE PROBLEMS TO EMERGE FROM OUR RESEARCH

At the heart of the challenges to democracy posed by digital media are three core problems:


  1. Platform monopolies: two or three corporations control not only our means of communication but also the content which is distributed, both of which are core aspects of our democracy, while the market power and global mobility of these companies make it possible for them to avoid national regulatory measures, either by moving operations elsewhere or simply ignoring them;
  2. Algorithmic opacity: algorithmic engines are using huge quantities of personal data to make ever more precise predictions about what we want to see and hear, and having ever increasing influence over what we think and do, with little transparency about how they work or accountability for their impact; and
  3. Attention economy: the dominant business model of digital media prioritises the amplification of whatever content is best at grabbing our attention, while avoiding responsibility for the impact that content has on our collective wellbeing and our democracy. And the negative impact is brutally clear from both the literature and the world around us.

It’s critical that this moment of global cooperation is used to address the wider, structural drivers of the biggest threats posed to democracy by digital media. These structural drivers include the power that a handful of privately-owned platforms wield over so many aspects of our lives, from what information we see, who we interact with, and who can access information about us. And we must do this while maintaining and building upon the many opportunities digital media simultaneously offer to tackle some of the biggest challenges facing democracy, including inequity of access and declining engagement.

In order to do that, action is needed sooner rather than later in order to:


> Restore a genuinely multi-stakeholder approach to internet governance, including rebalancing power through meaningful mechanisms for collective engagement by citizens/users;

> Refresh antitrust & competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness, with a particular focus on public interest media;

> Recommit to publicly funded democratic infrastructure including public interest media and the creation, selection and use of online platforms that afford citizen participation and deliberation;

> Regulate for greater transparency and accountability from the platforms including algorithmic transparency and accountability for verifying the sources of political advertising;

> Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and

> Recalibrate policies and protections to address not only individual rights and privacy but also collective dynamics and wellbeing, and protect indigenous rights. Public agencies responsible for protecting democracy and human rights online should reflect, in their leadership and approaches, the increasing diversity of our country.

THE OPPORTUNITIES OF DIGITAL MEDIA

The potential of digital democracy lies in its ability to increase democratic participation, embrace diversity of opinion, and empower marginalised groups. We identified six clear opportunities from the literature that digital media offers. These are: the democratisation of information publishing; broadening of the public sphere; increasing equality of access to and participation within political processes; increasing participation and engagement in political processes; increasing transparency and accountability from government; and promotion of democratic values.

Broadly speaking these opportunities fit into two categories: First, those that enable individuals, citizens or groups, who due to their status in society have been excluded from fully participating in different aspects of the democratic process, through greater access to the levers of democracy. Examples include the use of digital media to: broaden the public’s engagement with indigenous people and their lives, give more exposure to women in politics, build well-networked, educated and empowered communities, and encourage political engagement from youth.

The second category of opportunities relate to digital media’s use by people in governments to make the processes of democracy more inclusive, to increase engagement with citizens, improve transparency of government work, and rebuild trust in democratic processes. Examples of such work include online deliberative democracy processes, open or e-government initiatives, and funding of public service journalism, platformed on digital media.

DEMOCRATISATION OF INFORMATION PUBLISHING

Digital media enables the creation and sharing of content by anyone. This aspect in particular, the literature shows, has the potential to improve democratic participation by facilitating dialogue both between governments and citizens (improving institutional trust) and between otherwise divergent groups and individuals in society.


BROADENING OF THE PUBLIC SPHERE

The literature suggests that digital media can be used to widen policy conversations to include marginalised individuals and communities who have been previously excluded from democratic processes. A good example of this is the @IndigenousX Twitter account.



INCREASING EQUALITY OF ACCESS TO AND PARTICIPATION WITHIN POLITICAL PROCESSES

Several studies found that digital media increases equality of access to and participation within political processes, in terms of gender, class, race and age. Specifically, digital and social media:


> gives more positive exposure to women politicians than traditional news media;

> builds well-networked, educated and empowered communities which may previously have been economically and socially marginalised by digital divides (when incorporated with other good government policies such as civics education);

> facilitates the formation of both ‘ad-hoc’ and longer-term, group-based online communities focused on fighting racism, which can provide a safe space of belonging for ethnic minority groups;

> softens political inequality patterns by encouraging political engagement from 16–29 year olds.


INCREASING ENGAGEMENT IN POLITICAL PROCESSES

Numerous studies found links between digital media and increased engagement in political processes in the general population, not only in marginalised groups. This includes engagement in elections and different forms of deliberative democracy, as well as participation in more informal political action such as protests.



INCREASING TRANSPARENCY AND ACCOUNTABILITY FROM GOVERNMENT

While digital media has the potential to help rebuild public trust in democratic institutions and policies through “open government” and “e-government” initiatives, the research suggests that effectiveness likely depends on the base level of trust (i.e. such approaches may be more effective in low-trust environments where there is less initial transparency).



PROMOTION OF DEMOCRATIC VALUES


People in governments have been able to actively promote democratic values, informed debate, tolerance and respect for other groups using digital media. Examples of direct action include government funding of public service journalism and funding of independent statutory organisations such as All Together Now in Australia, which encourages the embrace of cultural diversity. Less direct action includes the use of digital media to promote participatory democracy activities, e.g. deliberative forums.


SUMMARY

The opportunities for digital media are significant and important. If used well, digital media can enable governments to respond effectively to the experiences of marginalised groups, to ensure equitable policies and practices are designed, delivered and adjusted, and to build trust in democratic institutions as responsive to the needs of all people. It offers as much to people pushing against barriers to their progression, inclusion, and improved wellbeing in society as it does to people in government looking to remove those barriers and build a more inclusive democratic system.



THE THREATS

However, the threats to this promise outlined in the literature are significant, and most are intricately bound up with the concentration of power in profit-driven companies. The seven key threats to inclusive democracy from digital media we identified were: the increasing power of private platforms; foreign government interference in democratic processes; surveillance and data protection issues; fake news, misinformation and disinformation; filter bubbles and echo chambers; hate speech and trolling; and distrust of/dissatisfaction with democracy.

Some of these threats or problems originate in the structures and systems of society, e.g. the power of private platforms over people’s lives. Others operate at an individual level, e.g. a growing distrust of democracy. However, all these threats are interconnected, and together they threaten to derail the democratic promise of digital media.

INCREASING POWER OF PRIVATE PLATFORMS


Private platforms have increasing power to determine all aspects of our access to information, social interactions, and democratic activities. Researchers highlighted the growing dominance of an increasingly small number of privately-owned platforms over the internet. The people who own and control these platforms have a monopolisation tendency linked to the relationship between the mining of user data and their imperative to make profits. This model of operation is termed “platform monopolies”. The monopolisation tendency makes alternatives to the data-extraction-for-profit model, for example co-operative, democratised ownership models, hard to start up and sustain.

The concentrated power of these platforms shapes not just the wider information context and the ability to develop alternative, non-extractive models of digital information provision and sharing, but also individuals’ personal lives. Platform monopolies affect how and with whom we interact socially through algorithms. A body of literature points to the actions that the people in these companies take that impact human rights, both through the control of personal data and through the level of control over what appears in the public sphere.

From this model of platform monopolies flows a series of further threats to democracy. Some relate to the features of the platforms, directly linked to the capture of people’s personal data. The collection and on-sale of personal data by these platforms, to both governments looking to undertake surveillance on their own citizens, and private organisations wishing to make profits, erodes public trust in information systems, and curtails the professional work of the media and writers - a key plank in our democracy.



FOREIGN GOVERNMENT INTERFERENCE IN DEMOCRATIC PROCESSES

The literature shows that foreign interference in democracy, specifically through the use and manipulation of digital and social media, contributes to decreased turnout and voter disengagement. Disinformation campaigns by foreign governments exaggerate already existing tensions and polarisations, and encourage a lack of faith in the electoral system and a lack of trust in the idea of liberal democracy.


SURVEILLANCE AND DATA CAPTURE


Evidence shows that platforms’ provision of private data, both to governments for the purposes of surveillance and to private organisations for profit-making, has a curtailing effect on key components of democratic function. Specifically, there has been a demonstrated curtailing effect on the private and public practices of both writers and journalists, while surveillance of Muslim communities, for example, contributes to alienation from mainstream society.



FAKE NEWS, MISINFORMATION AND DISINFORMATION


The creation of the “attention economy” also poses a significant threat. People’s propensity to attend to shocking, false, or emotive information, especially political information, is exploited and used as a commodity product by digital media platforms.

The literature shows that governments with the means and inclination to manipulate information can tailor false information towards individuals with the express intent of interfering in other countries’ democratic processes; for example, Russian government interference in the U.S. election of 2016 using ‘bots’ and disinformation campaigns.

Misinformation and disinformation, especially political disinformation, targeted at individuals on digital media, are used to influence politics, from national elections through to information provision and sharing on political issues and policy more generally. Political misinformation in particular has been found to have significant direct and indirect impacts on democratic participation and engagement.



FILTER BUBBLES AND ECHO CHAMBERS


Filter bubbles are a specific technical effect of the attention economy. Facebook’s news feed is a filter bubble, created by a machine-learning algorithm which draws on data created by user networks, likes and comments, and on how much organisations are willing to pay to be present there.

Filter bubbles follow a longer-term trajectory within advertising (including political advertising and now disinformation), which has sought to collect data in order to tailor adverts to target groups; now, however, they can be targeted to specific individuals.

This can contribute to the formation of echo chambers: the reinforcement of existing beliefs (confirmation bias) through selective exposure to information. Hence, the technical and economic drivers of filter bubbles can act to reinforce echo chambers.

Increasing numbers of automated social media ‘bots’ have been linked with the spread of political disinformation and thus the reinforcement of echo chambers. Filter bubbles and the related echo chambers they feed into are linked to a decline in trust in the ability of traditional news media to provide reliable information. They have been found to exacerbate political divisions and polarisation, and have negative implications for the mechanisms of liberal democracy: developing a broad consensus around decisions made in the public good becomes increasingly difficult.


HATE SPEECH AND TROLLING


The rise of hate speech and trolling is linked to the polarising effects of filter bubbles and echo chambers. A troll is an anonymous user who deliberately provokes antagonistic reactions for sheer enjoyment. Trolling is aided both by the ease of creating anonymous online profiles and by the atomised nature of internet interaction. Trolling can pose a direct threat to these opportunities when it becomes systematically targeted towards minority groups in order to deliberately cause emotional distress. Remaining anonymous also makes individuals more likely to escape prosecution for the more egregious examples.

Racialised hate speech (otherwise known as cyber racism) is specifically targeted towards ethnic minority groups, and has become increasingly coordinated in recent years, through the rise of the “alt-right”. It encourages affected groups to retreat to safe locations, rather than engaging with national debates and institutions.

Sexualised hate speech is primarily targeted towards women (together with members of the LGBTQI community), and is characterised by its specifically misogynistic nature. It is often directed towards women in the public eye, or those in influential positions, such as journalists, with proponents directing critical attention onto their supposed essential gender characteristics rather than their work. It has a negative impact on efforts to broaden the public sphere, as women are discouraged from writing what they may feel are controversial stories.

More generally, research has found a correlation between strong, vocal disagreements with an individual’s perspectives and a “spiral of silence” which acts to curtail the voicing of contentious opinions by minority groups. The particular ability of trolls and hate speech to fan antagonistic “flames”, rather than promote rational debates, has a direct impact on democratic participation.

DISTRUST / DISSATISFACTION WITH DEMOCRACY


While distrust of democratic processes is a longer-term issue, digital media has likely exacerbated this pattern across western democracies. Researchers argue that trust, informed dialogue, mutual consent, and participation (fundamental features of democracy) are being eroded by the very features that make social media so profitable.

Researchers also found that the way information is distributed on digital media (horizontal, decentralised and interactive) increases intolerance of others, polarisation, and scepticism toward democracy.



SUMMARY


The opportunities of digital media, while still apparent, appear to have been suppressed by the sheer weight of fake news, filter bubbles, populism, polarisation, hate speech, trolls and bots that have emerged from the concentration of power in a small group of private organisations seeking to maximise profits. Digital platforms initially celebrated for their democratic possibilities have transformed into anti-democratic power centres through the collection and exploitation of users’ attention and data. These privately owned platforms have largely escaped public oversight or regulation of their ability to harness this new power for commercial or political gain.

The question is what can policymakers do to re-calibrate? Are there empirically tested public policies and approaches that can ensure digital media works to strengthen and deepen democracy?


THE SOLUTIONS


A HIERARCHY FOR SOLUTIONS AND INTERVENTIONS


At The Workshop, we take an evidence-informed, hierarchical approach to exploring and understanding problems, and investigating and analysing solutions, policies and practices to overcome them. We work especially to highlight the critical role of structures and systems in improving people’s lives with the least individual effort required (though not the least political effort).


IDENTIFYING DRIVERS OF THE PROBLEM

First, we ask: is the problem we have identified a structural or systems-level problem (e.g. the structure of the economic model, the power of private markets over people’s wellbeing) or a group/individual-level response to the issue (e.g. distrust in democracy that results from a lack of inclusion in democracy)? Sometimes defining where problems originate is complex, as there are interactions and feedback loops, as with all complex issues.

For example, hate speech is an individual or group behaviour; it is fundamentally about how people or institutions treat others. However, the upstream issues that encourage and enable hate speech, intolerance and bigotry must be explored. Wealth, gender and ethnic inequalities in society, for example, mean digital platforms are primarily owned, designed and managed by those with little experience of differential or harmful treatment based on their position in society.

Without knowledge of how power imbalances and differential treatment based on gender or race play out in society, or a commitment to overcoming them, people who control these platforms can design in policies and practices that encourage hate speech and trolling. By presenting problems in a hierarchy we endeavour to make the feedback loops and upstream structures and systems issues clearer to people.


IDENTIFYING WHERE PEOPLE SHOULD INTERVENE FOR GREATEST IMPACT

In terms of considering “what works”, we focus on ‘upstream’ or structural and systems responses and solutions to the problems. We take this approach because research from across disciplines focussed on enhancing population wellbeing and equity shows interventions at this level:

> have the most significant impact on most people’s lives and outcomes, and

> require the least effort from individuals to achieve change, and the least resources from those trying to implement change.

We place less emphasis on individual behavioural solutions, not because they are not effective, but because to be effective these solutions (e.g. civics education, or consumers closing their Facebook accounts) take significant effort from both individuals and those encouraging such action, and may not address the structural drivers that cause the problems upstream. In addition, expending energy on individual-level solutions can divert energy from understanding and acting on structural-level solutions.

The possible interventions identified both in the literature review and by the interviewees are discussed in the context of a hierarchy from those likely to be most effective and requiring least individual effort, to those likely to have the least impact and requiring most individual effort on a population-wide scale. This hierarchy comprises interventions that:


> Change society-wide structural & systems issues to re-establish citizen power

> Create supportive environments & contexts - making the default digital space inclusive and safe

> Create long-lasting protections for people, and intervene to protect them from digital threats

> Build understanding of digital media threats and change individual behaviours in response.

Wellbeing Impact Pyramid (adapted by The Workshop from Frieden (2010)). From base to tip, the tiers are: change society-wide systems and structures; create supportive environments; create long-lasting protections for people and intervene to protect them; build people’s understanding and change behaviours. Population-wide impact increases towards the base of the pyramid, while the individual effort needed increases towards the tip.


FINDINGS

While our literature review was not exhaustive, in general we found a dearth of empirically tested solutions. Likewise, and possibly because of the lack of evidence in the literature, the experts interviewed for this research generally had more to say about the risks and threats they saw arising from digital media than about potential workable solutions. However, we did find some common ground between the literature and the interviews in terms of solutions.

In line with The Workshop’s evidence-led approach set out above, we discuss what empirical evidence we did find in a hierarchy, starting with those solutions likely to have the greatest impact and require the least individual effort.


CHANGE SOCIETY-WIDE STRUCTURAL & SYSTEMS ISSUES TO RE-ESTABLISH CITIZEN POWER


This section focuses on structural and systemic change, addressing for example the disproportionate power of the tech giants vis-a-vis governments, citizens and their domestic competitors.


REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Regulating platforms like other industries. Currently, regulatory debates largely centre around defining the structure, terms and conditions of what kind of industry private intermediaries represent. How platforms should be regulated or governed thus partly hinges on how these services are defined; for example, whether social media platforms are media companies, public spaces, utilities or some other service largely informs how they can ultimately be governed. There is little or no empirical evidence to show how regulation in this area would or would not work, and therefore adaptive approaches to policy and regulation will be needed. This will involve ensuring that the impacts of any change are regularly monitored and changes made as needed in response to those findings.

Introducing new modes of collective action. Under industrial capitalism we had collective bargaining and the strike – forms of collective action that were sanctioned by law and had the support of a society that allowed people to tame capitalism with legal protection. In relation to digital media, researchers suggest there are opportunities for more collective action both by tech workers, demanding for example more ethical design in the products they work on, and by digital media users. New forms of collective, collaborative action that connect users/consumers with the market and state to tame and outlaw surveillance capitalism are suggested by multiple researchers, but again there is no empirical testing yet to draw upon.


COMBAT FAKE NEWS BY:

Supporting a vibrant and diverse media sphere. One that balances strong, independent and adequately resourced public service media with a non-concentrated commercial media sector. Although there is an existing body of research showing the positive impact of a vibrant and healthy public and independent media on democracy, the specific impact of investing in media in the context of digital media is a widely proposed but as yet unmeasured idea.

CREATE SUPPORTIVE ENVIRONMENTS & CONTEXTS - MAKING THE DEFAULT DIGITAL SPACE INCLUSIVE AND SAFE


REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Designing new competitive digital media solutions. Disruptive technology is needed to forge an alternative digital future that, in turn, facilitates a more democratic internet. This means the creation of platforms offering a different set of affordances (i.e. not those driven by platform monopolies). Platform cooperatives like Loomio, subscription-based models, and pro-privacy, non-commercial alternatives are already in use and show some evidence of effectiveness in the literature.


REDUCE INTERFERENCE FROM FOREIGN GOVERNMENTS AND POWERS BY:

Designing new cybersecurity infrastructure and drawing upon “big datasets” to review and assess electoral policies. The research in this area is also largely normative, and generally prescribes that such infrastructure and reviews will reduce threats to elections and other political processes.


ADDRESS SURVEILLANCE & IMPROVE DATA PROTECTION BY:

Regulating companies’ information management practices. Some regulatory measures, like Singapore’s Personal Data Protection Act 2012, have been proven effective in bringing formal charges for data mismanagement and abuse.

Making regulatory changes to data privacy policies. However, there is little evidence to suggest that these changes will reduce surveillance/data collection so much as regulate how that data is stored, accessed and used by data collectors and other third parties.


COMBAT FAKE NEWS BY:

Developing and circulating persuasive counter-narratives. The focus would need to be on emotional, not rational, appeal. This is proposed but unmeasured.


OVERCOME FILTER BUBBLES/ECHO CHAMBERS BY:

Supporting new platform designs with different design affordances.

Design affordances ascribe meaning to how to use a digital media tool. For example, Facebook has a “friend” button, directing the user towards ways of interacting based on mutual agreement, and a “share” button, while Twitter has a “follow” button, open to all people using the platform, directing or suggesting different ways of interacting. The design of these affordances has an impact on inclusion and participation, as well as on the types of interactions people experience and the information they are exposed to. There is some suggestion that design affordances can reduce the effects of filter bubbles by engaging internet users in more ideologically diverse communities.

Non-commercial platforms like Loomio, for example, afford different modes of interaction based on the features (e.g., tools, interface) and environment (e.g., deliberative; asynchronous) they make available outside a commercial space. The platform affords an environment of less performative, and thus more considered, dialogue, discussion and debate. The relationship between design and civility on these new platforms has been empirically demonstrated, showing a reduced propensity to engage only with similar-minded people encouraged by automated filter bubbles, and movement of deliberation beyond debate to collective agreement.


OVERCOMING SILENCING EFFECTS OF HATE SPEECH BY:

Supporting new platform designs with different design affordances. Well-designed, collectively-owned, online deliberative fora like Loomio have been empirically shown to also create a safe environment for marginalised groups.

Research suggests that intentionally building more participatory forms of engagement into platforms might reduce filter bubbles, echo chambers and incivility (particularly on mobile devices), while increasing communication and deliberative processes. Therefore, the act of consciously designing social platforms to engender pro-social forms of engagement can have a demonstrated impact on civility.


IMPROVE TRUST IN DEMOCRACY BY:

The creation, selection and use of online platforms that afford citizen participation and deliberation. Some empirical evidence shows that direct and participatory democratic engagement/processes, e-government, and open government improve trust.

International research has found that engaging citizens in deliberative processes often results in profound changes in deliberating citizens’ views, “frequently in the direction of more common good-oriented policies”, but for them to be effective the systems and platforms used in these deliberative processes must also enable these practices to emerge. The techno-social affordances inherent to different online platforms affect and shape the nature of engagement, deliberation and discussion.

Using digital government processes. Transparent, easy-to-access and well-designed e-government and open government initiatives have been shown to increase positive feelings and citizen trust in local government. Some evidence shows governments that have created usable, intelligible websites, and offer non-exclusionary solutions for those lacking computer and internet access or basic digital literacy skills, have been most successful in their e-government initiatives and constituent satisfaction.

CREATE LONG-LASTING PROTECTIONS FOR PEOPLE OR INTERVENE TO PROTECT THEM FROM DIGITAL THREATS


REDUCE THE POWER OF PRIVATE PLATFORMS, COMBAT INCIVILITY AND MISINFORMATION ONLINE BY:

Improving content moderation. Calls for new regulatory policies around content moderation at large intermediaries acknowledge that content moderation remains an opaque and difficult practice, and on its own is not a fix-all solution. Current policies at the largest intermediaries attempt to balance stakeholder expectations (including users, consumers, advertisers, shareholders and the general public), commercial business goals, and jurisdictional norms and legal demands (which are generally governed by liberal-democratic (US) notions of “free speech”); goals related to inclusive and participatory democracy are not included.

The most common ‘workable solutions’ presented in relation to content moderation are processes that combine technical and social (human) responses. Advances in semi- or fully automated systems, including deep learning, show increased promise in identifying inappropriate content and drastically reducing the number of messages human moderators then need to review. In the literature, however, researchers note that neither automated nor manual classification systems can ever be “neutral” or free from human bias. Human and/or automated content moderation is unlikely to achieve “civil discourse,” a “sanitised” internet, or other speech and engagement goals through moderation alone. Therefore, the combination of automated classification and deletion systems and human efforts remains the most effective content moderation strategy currently on offer. In the few places where they exist, government regulations on private intermediaries’ moderation practices have not been empirically tested for their efficacy or effectiveness.


COMBAT FAKE NEWS BY:

Multi-stakeholder content moderation. This is an approach that combines human and technical intervention; however, it is a proposed but untested solution.


REDUCE HATE SPEECH/TROLLING BY:

Using identity verification systems. Sites that do not allow anonymisation and force pre-registration have been shown to solicit qualitatively better, but quantitatively fewer, user comments because of the extra effort required for engaging in discussion.

Empirical research has also found that abusive comments are minimised when anonymous commenting is prohibited.


BUILD UNDERSTANDING OF DIGITAL MEDIA THREATS AND CHANGE INDIVIDUAL BEHAVIOURS IN RESPONSE


REDUCE THE POWER OF PRIVATE PLATFORMS BY:

Building citizen-consumer activism and creating a “sea change in public opinion”. Scholars and theorists suggest that a shift in public attitudes is needed to persuade digital media companies to change; however, there is no empirical data to draw upon as to how effective this approach would or would not be.


ADDRESS SURVEILLANCE AND DATA PRIVACY ISSUES BY:

Encouraging individuals to employ technical solutions. Such solutions include ad-blockers and ad-tracking browser extensions, private browser options (e.g. Tor), open source platforms and cooperative platform models. “Evidence” supporting the efficacy of these tools and alternatives, however, is typically anecdotal or prescriptive in nature (as opposed to empirical).


COMBAT FAKE NEWS BY:

Education, particularly around critical thinking. Evidence for this approach has emerged in the health sector.


REDUCE HATE SPEECH BY:

Building resilience through support networks. Advocacy and civil society organisations like All Together Now, which develop fast and effective reporting mechanisms and support networks, have demonstrated some success with building online reporting tools that rely on crowdsourcing to identify, and thereby remove, racist hate speech online. A networked approach can effectively combat the effects of hate speech, for example by building counter-narratives that counteract racism.

Coordinating diverse stakeholders to apply pressure to private intermediaries in ‘long-haul’ campaigns has also been effective in having hateful content removed from social media. Speed of removal is considered essential to diffusing the power of hate speech and trolling. Pressure from researchers and advocacy groups alike has also encouraged some platforms to design more pro-social tools (i.e., affordances) into their systems.


IMPROVE TRUST IN DEMOCRACY BY:

Civics education. Educating children in schools on “good citizenship” has been positively associated with increased political engagement.


WHAT ROLE FOR NZ?

All of this raises the question: what role should New Zealand play in the wider global efforts to respond to the challenges of digital media? Some interviewees argued that New Zealand should follow the lead of bigger, like-minded liberal democracies like the United Kingdom, the European Union and Australia. Others thought New Zealand could, and should, be leading on these issues. Some saw specific opportunities for New Zealand to provide leadership in niches, like indigenous data sovereignty.

One thing many people agreed on was a sense of urgency – an urgency which has increased considerably in the months since many of these interviews took place.

As one participant put it, “we’ve got some really resounding early warning signals about how this stuff can be used to erode our democratic institutions, and if we don’t sit up and take notice of it, and don’t provide the necessary technical, social, and regulatory responses, we might wake up and find that we’ve missed the opportunity.”

NEW ZEALAND AS FOLLOWER


One of the common reasons given for taking the path of following the lead of others was New Zealand’s size. However another, perhaps more critical, argument was that New Zealand would need a much better system for making policy on these issues before we can be any kind of global leader. Before we can lead, this participant argued, we need to build up our national capacity to understand and deal with these issues, and build up more of an evidence base, so that we have a solid foundation for developing policy ourselves. It is hoped this research can help contribute to that process.


NEW ZEALAND AS LEADER


On the other hand, some interviewees asked why New Zealand should be a ‘taker’ of policy on these issues, and identified a great opportunity for New Zealand to team up with other like-minded democracies. We are typically at the cutting edge of technology, they argued, so why not take a lead on this? Digital media has brought advantages to New Zealand, and we want to make sure that we don’t lose the upsides of the new digital economy. Playing a leading role in the global response to the threats of digital media can help ensure that we do not.


NEW ZEALAND AS NICHE INFLUENCER

Some interviewees pointed to New Zealand's track record of taking a principled stand on big global issues, giving our nuclear-free policy as an example. One example given of an area in which New Zealand could show leadership is the development of a tech workers' union. Because New Zealand has comparatively better employment protections than many other places where tech people work, they said, there is already less fear of speaking up. We also have a sector small enough that personal relationships can easily be brought to bear on these situations.

Finally, but perhaps most importantly, some interviewees argued that there was a role for New Zealand to play as a leader on indigenous data sovereignty and on digital issues relating to Māori. This would first require us to address the significant gaps in our own protection of indigenous rights online. One of the most critical issues is the need to protect indigenous data sovereignty, allowing Māori ownership and control of Māori data.


CONCLUSIONS

At the heart of the challenges to democracy posed by digital media are three core problems:


  1. Platform monopolies: two or three corporations control not only our means of communication but also the content distributed through them, both of which are core aspects of our democracy. The market power and global mobility of these companies make it possible for them to avoid national regulatory measures, either by moving operations elsewhere or by simply ignoring them;
  2. Algorithmic opacity: algorithmic engines use huge quantities of personal data to make ever more precise predictions about what we want to see and hear, and have ever-increasing influence over what we think and do, with little transparency about how they work or accountability for their impact; and
  3. Attention economy: the dominant business model of digital media prioritises the amplification of whatever content is best at grabbing our attention, while avoiding responsibility for the impact that content has on our collective wellbeing and our democracy. That negative impact is brutally clear from both the literature and the world around us.

Combined, these problems pose serious threats to our democracy, so it’s critical that our responses to them don’t further undermine our democratic institutions. The history of digital media has shown that good intentions can, if not informed by the diverse experiences of users and the research evidence, cause more harm.

INCLUSIVE AND TRANSPARENT PROCESSES ARE CRITICAL

Firstly, as Natasha Tusikov and Blayne Haggart have argued, decisions about what kinds of information we have access to should not be made by a handful of American companies. Nor should our government’s role in those decisions take place in backroom negotiations.

We need to use democratic processes, which provide some degree of transparency about the decisions being made, accountability as to their impacts, and opportunities for challenge and judicial review. These processes must include meaningful participation by diverse representatives of the people whose lives are impacted by digital media.


MORE RESEARCH NEEDED

Secondly, the stakes are high here, so we must draw on the evidence as to what is most likely to work, where it exists. Perhaps the most predictable finding of this research is that there has been little or no investment by people in government or other research funders into experimenting with and recording possible solutions, and there needs to be more. A list of possible areas for further research is included in the full report.

It’s not surprising that there is so little experimental evidence as to the effectiveness of various solutions proposed in the normative literature. Change happens very quickly in this area. Until very recently there has been little investment in research from governments, which would be one of the expected sources of funding for such investigations. More is now urgently needed.


EVIDENCE-LED AND PRINCIPLED APPROACH TO URGENT POLICY DEVELOPMENT

Even in the absence of specific evidence as to the effectiveness of different interventions, there are areas in which action is urgently needed. In those cases, there are key principles that can be followed to reduce the risk of implementing solutions that do more harm than good. As a primary principle, we can take an evidence-informed, hierarchical approach to exploring and understanding problems, and investigating and analysing solutions, policies and practices to overcome them. This involves identifying underlying drivers of the problem, and those interventions which are most likely to have the greatest impact.

We set these principles out in more detail in the section on solutions above, but the key point is that we need to focus on tackling the structural drivers that underlie all the more specific problems outlined above, such as online abuse, the spread of disinformation, radicalisation and polarisation, political interference and manipulation, and distraction.

Solutions should then be designed to intervene at that structural level, addressing and rebalancing power through, for example, governance structures; regulation to restore transparency, accountability and fair competition; and genuinely participatory and representative multi-stakeholder processes.

None of this is to say that design solutions and platform affordances are not important. As the research shows, they will be essential. But without some rebalancing of power, without increasing the diversity of people involved in decision-making at the highest levels, those design solutions run the risk of replicating very similar problems to those we now face.

APPLY HUMAN RIGHTS PRINCIPLES

Human rights principles should also be applied to policy development in this area, and are particularly useful where there is an absence of specific research evidence. These principles include:


> Universality: Human rights must be afforded to everyone, without exception.

> Indivisibility: Human rights are indivisible and interdependent, which means in order to guarantee civil and political rights, governments must also ensure economic, social and cultural rights (and vice versa).

> Participation: People have a right to participate in how decisions are made regarding protection of their rights. Governments must engage and support the participation of civil society on these issues.

> Accountability: Governments must create mechanisms of accountability for the enforcement of rights. There must be effective measures put in place for accountability if those standards are not met.

> Transparency: Transparency means governments must be open about all information and decision-making processes related to rights. People must be able to understand how major decisions affecting their rights are made and how public institutions responsible for implementing rights are managed.

> Non-Discrimination: Human rights must be guaranteed without discrimination of any kind. This includes not only purposeful discrimination, but also protection from policies and practices which may have a discriminatory effect.

Each of these principles should be applied in the development of a multi-stakeholder response to the threats to democracy posed by digital media.


AGILE AND RESPONSIVE APPROACH TO POLICY

Finally, in the absence of a strong evidence base, it makes sense to take an agile, iterative approach to policy change: experiment with all the policies, all the time. Ensure that the funding, design, and implementation of policies reflect a record, learn, and adapt approach, measuring the impact of any new initiatives or regulations and making adjustments as evidence of their impact becomes available.

DIGITAL THREATS TO DEMOCRACY

URGENT AREAS FOR ACTION

Some of the areas in which action is needed include efforts to:


> Restore a genuinely multi-stakeholder approach to internet governance, including rebalancing power through meaningful mechanisms for collective engagement by citizens/users;

> Refresh antitrust & competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness, with a particular focus on public interest media;

> Recommit to publicly funded democratic infrastructure, including public interest media and the creation, selection and use of online platforms that afford citizen participation and deliberation;

> Regulate for greater transparency and accountability from the platforms, including algorithmic transparency and greater accountability for verifying the sources of political advertising;

> Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and

> Recalibrate policies and protections to address not only individual rights and privacy but also collective wellbeing. Policies designed to protect people online need to have indigenous thinking at their centre, and should also ensure that all public agencies responsible for protecting democracy and human rights online reflect, in their leadership and approaches, the increasing diversity of our country.

In the wake of the Christchurch mosque attacks, a new global momentum has emerged around the role that social media has played in the spread of violent extremism and terrorism, and what can be done to stop it. The New Zealand government has, rightly, stepped up to play a leadership role in that work.

What we need right now is a clear analysis of the wider structural and systemic issues that underpin the immediate moderation challenge and a solid proposal of regulatory and other changes that are needed to tackle those bigger issues. That means ensuring that the current, heightened public debate on the role of digital media in fostering and spreading hate is placed into a wider context of the regulatory and structural changes needed to revive and restore the key features of a healthy and peaceful democracy in our country, and around the world.

Our intention is that this research will help frame, inform and support those efforts.


SUMMARY OF KEY FINDINGS FROM EACH PART OF THE RESEARCH


KEY FINDINGS FROM THE SURVEY

Some key findings from the analysis of the survey data:


> Use of social media is high, and Facebook dominates, with Twitter and Instagram also used.

> Around a quarter of the sample used social media to engage with 'political issues' or politicians.

> Social media platforms are used for political activity by minority ethnic groups more than by Pākehā, indicating their potential use as a tool for engagement in the formal democratic system.

> Stated trust in news online may be low, but perceptions of information credibility are driven by trust in friends, family and organisations.

> Most people still rely on mainstream media for information about a key political issue (decriminalisation of marijuana) but friends and family and online news feature strongly.

> New Zealanders are relatively accurately informed about the opinions of others with regard to the decriminalisation of marijuana.

> There is evidence that New Zealanders who believe their views are in the minority on decriminalisation of marijuana are less willing to share their views both offline and online. This suggests social media platforms replicate rather than overcome existing barriers to engaging in less formal processes of democracy (public discussion and discourse) for people who hold minority views.

KEY FINDINGS FROM THE LITERATURE REVIEW

To paraphrase classical historian Mary Beard, western democracy is a 2,000-year-old experiment. In 2019, the significant technical disruption that is digital media is having a powerful effect on the results. Yet what is the nature of that effect? Does our collective written and published knowledge tell us what benefits and opportunities digital media offers in building a stronger, more inclusive and participatory democracy, and what threats or risks it poses? And what, if anything, does the empirical evidence tell us optimises the opportunities and reduces the risks to our democracy from digital media?

The answer to these questions remains elusive. While our literature review was not exhaustive, this research confirms that there is, at present, a troubling dearth of scientific, empirical, evidence-based research that tests or aims to validate “workable solutions” to the seven key threats to democracy we’ve identified in this project.

While some empirical evidence exists, notably in the area of designing new platforms and affordances with pro-social intent, the significant majority of the research relating to the threats we identified is based on expert opinion and normative approaches. That is, it presents theoretically sound arguments about the way things "ought to be" if democracy is to be "reclaimed" from incivility and a rogue form of capitalism in the digital age.

In the expert opinion literature the following four themes were identified:


  1. Policy / Legal Solutions

For example: adapt existing legislation; create new legislation; institute new oversight bodies or inter-governmental agencies; or improve regulations on content moderation.

  2. More Corporate Transparency

Currently the lack of transparency around moderation practices presents challenges to accountability, governance, and the ability to apply public and legal pressure. Expanding empirical research to improve moderation processes requires private intermediaries to make these processes and practices accessible to researchers.

  3. Better Design

Platform design can influence the way individuals, organisations and institutions make decisions around platform uses/objectives. Pro-social and democratic values must be encoded into the infrastructure of the internet, including algorithms. At present, the normative values embedded into these global private intermediaries – e.g., openness, connectedness, free speech, etc. – are not culture-neutral norms. Recognising this is the first step towards designing more deliberative spaces, pro-social tools and online environments.

  4. Improve Content Moderation

Calls range from the standardisation of industry-wide "best practices" to more transparency and researcher access. These actions would require greater corporate transparency; corporate grievance mechanisms that are transparent, accessible and in accordance with international human rights law; and a multi-stakeholder, inclusive governance approach. Content moderation should become an organisational priority rather than a departmental silo.

This absence of tested solutions is not evidence that proposals do not work, only that they are untested. This leads us to conclude that there is a critical need for investment in more research. People in government, civil society, NGOs, and private enterprise need to commit to researchers and projects who will do pre- and post-testing of the solutions that stakeholders are recommending.

Such research will not only measure effect and enable us to extend what’s working to other places or contexts, but ensure future normative prescriptions are informed by evidence beyond the anecdotal (or budgeting restrictions).

It is critical that people in the New Zealand government especially measure whether or not what is being done is working to build a more inclusive and participatory democracy. New Zealand would break significant ground in that regard.

When people in government and civil society seek recommendations for solutions, they need to mitigate the risk that experts reproduce “solutions” that fit the professional discourses in which they’re embedded. To do this, it is important that people in government ask multi-stakeholder group participants:


> What if any evidence do they have for the suggestions made?

> What experiences inform these recommendations and why do they identify them as workable solutions over others?

> How do they imagine testing their effectiveness?

> Given the current lack of evidence, it is critical that the values, experiences, and outcomes that underlie recommendations are made transparent and visible.

KEY FINDINGS FROM THE INTERVIEWS

One of the questions we posed in the interviews was what had changed in the landscape of democracy through the influence of digital media, and what had not.


A FAMILIAR BUT CHANGING LANDSCAPE

Many things have not changed, participants told us. Misinformation, disinformation and harassment are not new. Outrage, political polarisation and extremism are not new. Filter bubbles, soundbite politics and data capture: none of this is new. Even the erosion of the authority of published material isn't new. And perhaps most tellingly, the cultural hegemony of tech isn't new. Some participants argued that despite all that has changed as a result of digital media, the replication of existing power structures in the governance and management of the tech giants has inevitably reinforced many already entrenched power imbalances. Further, they said, the lack of diversity at governance and senior management level prevented these companies from identifying and responding adequately to the risks and threats inherent in their platform designs.

So what has changed? While recognising that the foundations of mis- and disinformation, online harassment and abuse, polarisation and extremism all existed well before the rise of digital media, most interviewees nonetheless saw particular ways in which the features and functions of digital media have changed the scale, intensity and reach of those phenomena. Digital media has changed the scale, speed and breadth at which information can be shared. It has allowed advertisers, including political advertisers, to target people with much greater precision. Digital media has generated new levels of distraction, undermining citizens' capacity to engage in the complex thinking demanded in a democracy. Data has taken on a new value, and has been gathered and used at an unprecedented scale. And finally, but again perhaps most importantly, a very small number of very large companies control the means of communication used by the majority of people in most democracies on the planet.

So given what has changed with the rise of digital media, what has stayed the same, and the structural underpinnings of the major digital platforms, where are the biggest opportunities for democracy? The obvious and most commonly cited opportunities were in the democratisation of information, increased diversity in public discourse, more public engagement with government and democratic processes, and increased transparency and openness in government.

On the other hand, participants described considerable risks and threats to democracy, including digital exclusion. The most commonly cited risks were the impact of digital monopolies and lack of competition on public-interest media, and misinformation and disinformation, including deepfakes and the consequent erosion of trust in information. Other commonly cited risks included political manipulation, including foreign interference, the cybersecurity of government and the security of elections, and the more routine manipulation through political advertising, with related risks of polarisation, radicalisation and 'echo chambers'. Other significant risks highlighted by participants were the impact on democracy of online abuse and hate, disengagement, distraction and attention hijacking, and loss of privacy and consent fatigue. Woven throughout many interviews was a recognition that a lack of transparency and accountability by the big platform companies underpinned and exacerbated all of these risks.

As one participant put it, overall, the picture of how democracy as a form is evolving under the influence of digital media is ‘quite messy’. “[I]t’s got all these new ways to participate, all these new channels for participation. At the same time, it’s getting harder to curate and access that content online, and also critique it. So it’s a messy space to talk about risks and opportunities, because the whole landscape is so complicated and moving.”

SOLUTIONS

Interviewees suggested a range of interventions and solutions to both maximise the opportunities for democracy presented by digital media and minimise the threats. These range from interventions at the structural and systemic level through to suggestions for individual behavioural change.

Some of the areas in which action was identified as being most urgent include efforts to:


> Restore a genuinely multi-stakeholder approach to internet governance, including rebalancing power through meaningful mechanisms for collective engagement by citizens/users;

> Refresh antitrust & competition regulation, taxation regimes and related enforcement mechanisms to align them across like-minded liberal democracies and restore competitive fairness, with a particular focus on public interest media;

> Recommit to publicly funded democratic infrastructure, including public interest media and the creation, selection and use of online platforms that afford citizen participation and deliberation;

> Regulate for greater transparency and accountability from the platforms, including algorithmic transparency and greater accountability for verifying the sources of political advertising;

> Revisit regulation of privacy and data protection to better protect indigenous rights to data sovereignty and redress the failures of a consent-based approach to data management; and

> Recalibrate policies and protections to address not only individual rights and privacy but also to collective dynamics and wellbeing, and protect indigenous rights. Public agencies responsible for protecting democracy and human rights online should reflect, in their leadership and approaches, the increasing diversity of our country.





APPENDIX 1: LITERATURE REVIEW PART 1

DIGITAL THREATS TO DEMOCRACY

LITERATURE REVIEW PART 1: THREATS AND OPPORTUNITIES

Leon Salter, Kathleen Kuehn, Jess Berentson-Shaw & Marianne Elliott




DIGITAL THREATS TO DEMOCRACY



CONTENTS

INTRODUCTION

SPECIFIC OPPORTUNITIES FOR IMPROVING DEMOCRATIC PARTICIPATION THROUGH DIGITAL MEDIA

2.1 – Democratization of information publishing
2.2 – Broadening the public sphere
2.3 – Increasing equality of access to and participation within political processes
2.4 – Increasing participation and engagement in political processes
2.5 – Increasing transparency and accountability from government
2.6 – Promotion of democratic values

CURRENT THREATS/BARRIERS TO ACHIEVING OPPORTUNITIES

3.1 – Increasing power of private platforms
3.2 – Foreign government interference in democratic processes
3.3 – Surveillance and data protection
3.4 – Fake news/disinformation
3.5 – Filter bubbles/echo chambers
3.6 – Hate speech and trolling
3.7 – Distrust/dissatisfaction with democracy

CONCLUSION

REFERENCES


INTRODUCTION

From the early development of digital media, and in the wake of large-scale democratic action including the Occupy movement and the Arab Spring, there was optimistic academic consensus on the capacity of digital media to increase democratic participation. However, the election of Donald Trump and the Brexit referendum have shaken the foundations of Western democracies, and turned that optimistic view into significant concern about the role of digital media in eroding democratic participation. There is now clear evidence of interference by the Russian government in the 2016 US presidential election using digital media strategies, which had the effect of discouraging sections of the public from voting (Persily, 2017). Fake news, filter bubbles, populism, polarisation, hate speech, trolls and bots are firmly embedded in mainstream understanding of digital media. All this is in a context where election turnout and trust in government institutions were already in general decline in Western nations (Leininger, 2015; OECD, 2017).

Rather than the increasingly widespread adoption of digital media necessarily leading to a pattern of increased participation, diversity of opinion, and the empowerment of marginalised groups, digital media (in particular social media through its algorithmically calculated news feeds), can work to create opinion silos, or “echo chambers”, which can “limit the possibility of understanding differences and increase the likelihood of intolerance and hostility” (Lu & Yu, 2018, p. 3).


In this narrative literature review we sought to describe, from the recent literature, the nature of the opportunities and threats to democracy from developments in digital media.

We asked two research questions:


  1. What are the specific opportunities digital media presents for improving democratic participation?
  2. What are the current threats/barriers that are in place to prevent achieving those opportunities?

In total, 110 documents were reviewed (including journal articles, reports and book chapters), with 69 of those containing evidence to support one or more of the research questions (see reference list).

A non-systematic narrative review was chosen with a view to summarising the themes that have been covered in terms of opportunities and problems (risks and threats). Searches were limited to research published in the last eight years (most are within five).


SPECIFIC OPPORTUNITIES FOR IMPROVING DEMOCRATIC PARTICIPATION THROUGH DIGITAL MEDIA

This section outlines six specific opportunities provided by digital media for improving democratic participation discovered in the literature. These are: the democratisation of information publishing; the broadening of the public sphere; the increasing equality of access to and participation within political processes; increasing participation and engagement in political processes; increasing transparency and accountability from government; and the promotion of democratic values.

2.1 DEMOCRATIZATION OF INFORMATION PUBLISHING

This is the capacity of digital media to enable "anyone to create content and share it with a global audience" (College of St George, 2018, p. 1). As well as being identified in the background paper to a consultation currently being undertaken by the UK organisation the College of St George, this capacity has been noted by the influential sociologist Manuel Castells (2013). Castells termed it "the shift of mass communication to mass self-communication" (p. 23), whereby large media corporations and governments no longer dominate the production of messages and content to the degree they did for most of the 20th century. Whereas in the age of mass communication the ability to generate content was limited by access to costly printing facilities, TV studios and the like, in the age of mass self-communication such entry barriers have been reduced to simply owning a laptop or a mobile phone.

This has the potential to improve democratic participation by facilitating dialogue both between governments and citizens (improving institutional trust) and between otherwise divergent groups and individuals in society.


2.2 BROADENING THE PUBLIC SPHERE

Linked to 2.1, digital media has the capacity to widen policy conversations to include “previously marginalized individuals and communities” (College of St George, 2018, p. 1), who formerly would have been excluded from democratic processes. A good example of this is the @IndigenousX Twitter account (explored in more detail in section 4.3), which provides a platform for the articulation of indigenous Australian culture and perspectives, “bringing their views and concerns to a wider audience” (Sweet, Pearson, & Dudgeon, 2013, p. 109) than would have been possible before the widespread adoption of digital media.

In a study of the US context, Auger (2013) found that social media increased the opportunities for NGOs to express their perspectives, meaning a larger "marketplace of ideas" has been able to take shape than previously. Moreover, social media meant that it was easier for non-mainstream ideas to become legitimised, which was linked to the securing of funds for the NGOs' activities. Further, by analysing 235 NGO social media posts, Auger (2013) found that "rational appeals were the most frequent type of advocacy characteristics used" (p. 373), despite the issue studied being the highly contentious one of gun control in the US, which appears to counter the more recent emphasis on social media filter bubbles (see section 3.2). The study identified 274 different message characteristics from that corpus, assigned by the purpose, content and emphasis of posts from four different NGOs, including the National Right to Life Committee and the National Rifle Association. Only 17% of the "message appearances identified" (p. 373) fitted the study's definition of propaganda, defined by the four features of reducing complex issues, use of authority figures, emphasis on conflict rather than cooperation, and reduction of complex issues to cause and effect.

2.3 INCREASING EQUALITY OF ACCESS TO AND PARTICIPATION WITHIN POLITICAL PROCESSES

Several studies reviewed found that digital media had the potential to increase equality of access to and participation within political processes, in terms of gender, class, race and age.

In a study of the Israeli 2015 parliamentary election, Yarchi and Samuel-Azran (2018) found that Facebook afforded more positive exposure to women politicians than traditional news media. The authors found that "female politicians' posts generated significantly more user engagement in terms of the number of Likes and Shares in comparison to male politicians" (p. 978), creating a supportive communicative environment which boosted their self-esteem.

Two studies (Dubow, 2017; see Government Information Services, 2018, for the New Zealand context) found evidence through interviews with experts that digital media, when incorporated with other good government policies (such as civics education), has the potential to build well-networked, educated and empowered communities, which previously have been economically and socially marginalised by digital divides. Dubow (2017) recommended the development of new digital tools focused on breaking down and summarising civic information, while Government Information Services (2018) recommended tools which allowed different levels of participation, increasing inclusivity across genders, ethnicities and ages.

In terms of race, Jakubowicz et al. (2017) found, through the examination of several case studies in the Australian context, that digital media can facilitate the formation of both 'ad-hoc' and longer-term, group-based online communities focused on fighting racism, which can provide a safe space of belonging for ethnic minority groups. This sense of community encourages engagement and participation in public discourse, such as the campaign to change the date of Australia Day, which could otherwise be discouraged by online hate speech and more mainstream forms of racist discourse (see sections 3.3 and 4.3 for further details).

A survey undertaken by the UK think tank Demos (Miller, 2016) found that social media and other digital media forms increased participation and engagement in the British 2015 elections by young people, an age group which is believed to have become increasingly disengaged from political processes. Similar results were derived by Xenos, Vromen, and Loader (2014), who, in a comparative study of Australia, the USA and the UK, found that social media can “soften traditional patterns of political inequality” (p. 152) by encouraging political engagement among 16-29 year olds.

DIGITAL THREATS TO DEMOCRACY

2.4 INCREASING PARTICIPATION AND ENGAGEMENT IN POLITICAL PROCESSES


Numerous studies found links between digital media and increased participation and engagement in political processes in the general population, not only in marginalised groups. This includes engagement in elections, different forms of deliberative democracy, as well as participation in more informal political action such as protests.

In terms of voting, the above-mentioned Demos study (Miller, 2016) found that “39 per cent of poll respondents who had engaged with political content on social media felt more likely to vote as a direct result” (p. 11). Such positive results can be partly attributed to the capacity of platforms such as Twitter to provide highly interactive, temporary political discussion fora through the hashtag function, which are easy to engage with, and accessible without high degrees of technical or political knowledge. For example, Barack Obama’s 2015 State of the Union address “spurred approximately 2.6 million tweets” (Gayo-Avello, 2015, p. 10) via the #SOTU hashtag.

In terms of participative democracy, a study of the public review of the Icelandic constitution (Valtysson, 2013) found that the use of Facebook, Twitter, Flickr, and YouTube both increased engagement and facilitated the emergence of “networked publics” which promoted consensual opinion formation, despite lacking formal decision-making authority. In another study in the Icelandic context, but in the area of local government, Simon, Bass, Boelman, and Mulgan (2017) found that the Better Reykjavik (idea generation) and Better Neighbourhoods (participatory budgeting) platforms saw 70,000 citizens taking part out of a total city population of just 120,000. The active participation of so many people would be extremely difficult to organise without digital media platforms.

The same research (a set of case studies) also looked at France’s Parlement et Citoyens, a “website which brings together representatives and citizens to discuss policy issues and collaboratively draft legislation” (p. 24). This initiative aims to move beyond consultation towards citizens “inform[ing] and shap[ing] legislation which is put before Parliament” (p. 24). Survey evidence from Switzerland (Kern, 2017), which has a high number of binding referendums, indicates that the availability of such systems of direct democracy increases feelings of having influence over the system, thereby increasing the likelihood of participation in formal democratic processes. Kern (2018), through a combination of quantitative surveys and qualitative, semi-structured interview data from Belgium, also found links between participation in a single referendum (the most common form of direct democracy) and the intention to participate in future political protests.

Additionally, in the cases of the Arab Spring, Los Indignados and Occupy movements, Twitter was found to have played a key role in the organisation of those large-scale protests (Bennett, Segerberg, & Walker, 2014; Papacharissi & de Fatima Oliveira, 2012). Particularly in the case of the Los Indignados movement in Spain (Bennett & Segerberg, 2012), digital media platforms were found to be “taking the role of established political organizations” (p. 742) such as political parties and unions, which were regarded as corrupt.

2.5 INCREASING TRANSPARENCY AND ACCOUNTABILITY FROM GOVERNMENT


A substantial amount of recent research was found on “open government” and “e-government” initiatives, with an influential OECD (2017) report highlighting their importance in rebuilding public trust in democratic institutions and policies. The general consensus is that governments which make data on the transactions of government departments available to their citizens via open government portals allow citizens to see that taxpayer funds are being spent appropriately and fair decisions are being made in a transparent manner, thereby increasing trust (see Kim & Lee, 2012; Nielsen, 2017; Wu, Ma, & Yu, 2017).

In a large quantitative study of survey data from 36 major cities in China, Wu et al. (2017) found that such moves towards transparency are particularly effective when overall trust in government institutions is low. Perceptions of equality of public service provision are also “substantially strengthened [by open government initiatives] when government trust is low” (p. 898). However, Lourenço (2015) noted, through an analysis of seven Western open-government portals from the perspective of the “ordinary citizen”, that terms such as “open government” and “e-government” can be vague, allowing for large differences in levels of accessibility. Hence, if such websites do seek to enable the holding of government to account by the citizenry, they need to do more than merely dump raw data: they need to structure websites and data so that citizens without data expertise can use them (more on this in section 4.7).


2.6 PROMOTION OF DEMOCRATIC VALUES


The sixth and final opportunity outlined in the literature is the use and regulation of digital media by governments to actively promote democratic values, informed debate, tolerance and respect for other groups. This can be done directly through government funding of public service journalism, as advocated for by the Organization for Security and Co-operation in Europe (OSCE, 2017). A second direct form of promotion is the funding of independent statutory organisations such as All Together Now in Australia, which is “focused generally on encouraging the embracement of cultural diversity and the eradication of racism” (Jakubowicz et al., 2017, p. 242). The organisation runs the anti-racism Twitter account @itstopswithme, together with the #itstopswithme hashtag, which encourages interactive citizen engagement in campaigning.

Less direct are the general educative effects of participating in democratic processes, the scope and breadth of which, as this section has shown, can potentially be widened by digital media. Michels (2011) conducted a meta-analysis on the effects of citizen participation in democratic processes, collecting “empirical evidence about effects from 120 cases in different Western countries” (p. 276). The overall finding was that government programs which promote participatory democracy have “a positive effect on the development of knowledge, skills, and virtues [which includes] active participation in public life, trustworthiness, and reciprocity (giving and taking)” (p. 278). This supports the OECD’s (2017, p. 118) argument that, together with short-term, functional benefits, there are also intrinsic, long-term benefits to government support for digital media-enabled participatory democracy, which will be discussed in more detail in section 4.

While the literature highlights many opportunities offered by digital media for improving democratic engagement, there are also many threats. The following section covers those revealed by the published literature.


CURRENT THREATS/BARRIERS TO ACHIEVING OPPORTUNITIES

This section will define and outline the emergent threats or barriers to increasing democratic participation through digital media. Seven key threats, which overlap to some degree, have been identified from the literature; all are linked to the increasing influence of digital media in society, though this is not their only cause. These threats range from issues at the structural and systems level (e.g. interference by governments) through to threats arising from individual responses (e.g. distrust). Together, they threaten to derail the realisation of the specific opportunities outlined in section two, and with it the optimistic promise of digital media.

These threats have been identified as:


  1. increasing power of private platforms,
  2. foreign government interference in democratic processes,
  3. surveillance and data protection,
  4. fake news (also known as misinformation and disinformation),
  5. filter bubbles (also known as echo chambers),
  6. hate speech and trolling, and
  7. a growing distrust of or dissatisfaction with democracy.


3.1 INCREASING POWER OF PRIVATE PLATFORMS


Private platforms have increasing power to determine all aspects of our information lives, social interactions, and democratic activities. This power underpins, and flows back from, most of the other threats we discuss.

Sections of the reviewed scholarship highlighted the growing dominance of an ever smaller number of privately-owned platforms over the internet (see Fuchs, 2014). Google and Facebook dominate the digital advertising market, both in the US and in New Zealand (Myllylahti, 2018; Srnicek, 2017). The two companies “drive 53 percent of [New Zealand] news websites’ traffic” (Myllylahti, 2018, p. 6), but without contributing a corresponding volume of advertising revenue, thereby threatening journalism’s economic foundations, with serious repercussions for the breadth and quality of the public sphere (opportunity 2.2).

Srnicek (2017) highlights a monopolisation tendency “built into the DNA of platforms” such as Google and Facebook, linked to the close correlation between the mining of user-data and the ability of these companies to make profits. Such is the value of data in what Srnicek terms the era of “platform capitalism” that Google and Facebook are rapidly purchasing smaller companies so that they are able to control the extraction, processing and analysis of such data, thereby setting the rules of the game and making it increasingly harder for competitors to enter the market.

Further, it becomes increasingly difficult for companies that offer alternatives to the data-extraction for profit model, such as Loomio (discussed further in Part 2), to survive in this environment (Jackson & Kuehn, 2016). Loomio is a deliberative democracy tool intentionally organised around the principles of open source (user control over source code) and co-operative, democratised ownership and decision-making. However, because it lacks the resources to sustain huge servers or cloud services required for the large amounts of data necessary for the functioning of its platform, it must lease these services from the big platforms. Hence, “it must sacrifice some aspects of control for economic reasons” (p. 424).

At the level of the individual, our lives are led more and more through these platforms, meaning that they increasingly shape our social worlds. As the Internet Governance Forum (IGF, 2015) puts it, “increasingly, the operation of these platforms affects individuals’ ability to develop their own personality and engage in a substantial amount of social interactions” (p. 1). Hence, the actions of these companies can impact human rights (see also OSCE, 2017), and not only through their control of personal data: because their algorithms dictate what does and does not appear in the public sphere, they could also be seen as exercising a form of censorship. Complicating this further is that human rights protections are normally applied to national governments, rather than private companies.

3.2 FOREIGN GOVERNMENT INTERFERENCE IN DEMOCRATIC PROCESSES


A recent US intelligence report “claimed with a high degree of confidence” (Ziegler, 2018, p. 567) that “Russian President Vladimir Putin ordered an influence campaign in 2016 aimed at the US presidential election” (Intelligence Community Assessment, quoted in Ziegler, 2018, p. 567). This included a deliberate strategy to use social media “to undermine confidence in the election and to magnify stories critical of Hillary Clinton” (Persily, 2017, pp. 70-71). Teams of trolls were employed in order to post negative political advertising stories online (Persily, 2017), and damaging emails were distributed through Wikileaks (Ziegler, 2018). Persily (2017) argues that the negative advertising contributed to decreased turnout and voter disengagement.

As noted by Ziegler (2018), the US has not been the only target of Russia’s military intelligence unit the GRU, with German and French elections also targeted by disinformation campaigns during 2017. Ziegler (2018) argues that such tactics should be placed in the context of a broader strategy of “hybrid warfare”, where Russia seeks to exaggerate already existing tensions and polarisations by encouraging a lack of faith in the electoral system and trust in the idea of liberal democracy.

Recently, security services in New Zealand have revealed that New Zealand has been the target of attempts to interfere in its democracy through a “range of vectors” (Moir, 2019). The threat to free and fair elections, and more generally to liberal, participatory and inclusive democracy, through the manipulation of digital media is a well-established one.


3.3 SURVEILLANCE AND DATA PROTECTION


The 2013 exposé by Edward Snowden and the 2018 Cambridge Analytica revelations have brought the issues of data privacy and surveillance into the public eye. The former revealed that the major internet service providers were sharing the data of their customers with US government agencies such as the NSA. Further, Snowden revealed that this data collection was also being done in the other member countries of the ‘Five Eyes’ (the UK, Australia, Canada and New Zealand) in order for governments to carry out mass surveillance on their citizens (Fuchs & Trottier, 2017).

The Cambridge Analytica revelations highlighted that “Facebook gave unfettered and unauthorized access to personally identifiable information...of more than 87 million unsuspecting Facebook users to the data firm” (Isaak & Hanna, 2018, p. 56). The scandal brought to the surface the underlying mechanics of the attention economy, outlined in section 3.1. Put simply, the consumer of social media is also the product – their personal data is the oil that greases the machine, or as Ghosh and Scott (2018) put it, “behavior tracking and the business of online advertising is central to the market power of global internet platforms” (p. 6).

These revelations are having effects on the perceptions of internet users: surveys in the UK have revealed deep concerns about such practices (Fuchs & Trottier, 2017), while a recent survey by InternetNZ revealed that data security and privacy was one of the top five concerns of New Zealand users (InternetNZ, 2017b). In the US, surveys of writers (PEN America, 2013) and investigative journalists (Holcomb, Mitchell, & Purcell, 2015) have revealed a worrying “chilling effect”, similar to the spiral of silence mentioned earlier, which significantly curtails the private and public practices of both professions, which are vital for the sustainment of a healthy public sphere (opportunity 2.2). Evidence from the UK indicates that ongoing surveillance of Muslim communities contributes to feelings of alienation from mainstream society (Blackwood, Hopkins, & Reicher, 2013), thereby having a detrimental effect on opportunity 2.3. In the wake of the Christchurch terror attacks in March 2019, in which 50 Muslim New Zealanders were murdered by a white supremacist, surveillance of the Muslim New Zealand community by government agencies through digital means is also being highlighted as a key threat to democracy (Human Rights Foundation of Aotearoa New Zealand, 2016).


3.4 FAKE NEWS/DISINFORMATION


Probably the most famous (or infamous) of the identified threats to democratic participation, due to its links to the current US president, the term “fake news” rapidly went “from being marginal to near ubiquitous” (Farkas & Schou, 2018, p. 304) within news media discourse in the immediate lead-up to, and aftermath of, the November 2016 election in the US. Linked to this emergence within a highly politicised context, the term has become a tool for the delegitimization of political opponents, signalling a broader “clash of narratives” (Marda & Milan, 2018, p. 3) between conservatism and liberalism in the US.

The phenomenon could also be labelled propaganda, but as this is also a highly loaded term, disinformation is most suited to our purposes here. Disinformation is distinguished from misinformation, with the latter lacking a deliberate intention to deceive. The former, by contrast, is defined as “false or misleading information that is deliberately disseminated to deceive a target audience” (College of St George, 2018, p. 2). As well as containing deliberately misleading content, disinformation can be disingenuous concerning its “origins and affiliations...[often undertaking] concerted efforts to mask these origins” (FireEye, 2018, p. 5). Because it is defined by intent, disinformation can become misinformation when it is unintentionally spread by human interaction online (Vosoughi, Roy, & Aral, 2018).

Ghosh and Scott (2018) offer a yet more precise term in “political disinformation”, defined as “highly targeted political communications that reach millions of people with customized messages that are invisible to the broader public” (p. 3). Ghosh and Scott (2018) thereby link the phenomenon directly to what has been commonly termed the “attention economy”, or “the financial interests that drive the core technologies of the leading internet platforms” (p. 4). Polarising political posts (whether true or not) evoke the strongest emotions, and therefore hold attention, “which in turn generates [advertising] revenue” (p. 4). As has become clear with the recent Cambridge Analytica revelations (Isaak & Hanna, 2018), targeted political advertising is a highly profitable business.

Not only does political disinformation hold attention, it also spreads faster around the internet through likes, shares and retweets. Vosoughi et al. (2018) found that “false political news...diffused significantly farther, faster, deeper, and more broadly than the truth” (p. 1), and than other highly viral types of news such as reports on terrorism. Guo, Rohde, and Wu (2018) found that this had a direct impact on the 2016 election, as “fake news sites...were mainly responsible for spreading negative news about [Hillary] Clinton” (p. 14), rather than Donald Trump.

3.5 FILTER BUBBLES/ECHO CHAMBERS


While filter bubbles and echo chambers are terms which are often used interchangeably, the former tends to refer to a specific technical effect of the attention economy, while the latter has a broader social and psychological dimension. The most famous example of a filter bubble is Facebook’s news feed, created by a machine-learning algorithm which draws on data created by user networks, likes and comments. The algorithm (and hence the news feed) can also be influenced by how much companies and organisations are willing to pay to be present there (hence political disinformation requires filter bubbles to be effective; see Ghosh & Scott, 2018). However, the issue is not restricted to Facebook. Political disinformation sites can also take advantage of Google’s algorithm by paying for “top billing” to appear high up in searches (Ghosh & Scott, 2018).

Hence, being central to the profits of the two most powerful internet platforms (Srnicek, 2017), filter bubbles are today so ubiquitous that they most often work in the background of our daily lives, shaping the information we receive “imperceptibly and without consent” (College of St George, 2018, p. 5).

Filter bubbles follow a longer-term trajectory within advertising (including political advertising), which has sought to collect data in order to tailor adverts to target groups; now, however, adverts can be targeted to specific individuals (Ghosh & Scott, 2018). This can contribute to the formation of echo chambers: the reinforcement of confirmation bias through selective exposure to information (College of St George, 2018; Guo et al., 2018). Hence, the technical and economic drivers of filter bubbles can act to reinforce echo chambers, but the two cannot be reduced to each other, with the latter existing before social media through, for example, the alignment of newspapers with political affiliations (Möller, Trilling, Helberger, & van Es, 2018).

Increasing numbers of automated social media ‘bots’ have also been linked with the spread of political disinformation and thus the reinforcement of echo chambers (Farkas & Schou, 2018; Persily, 2017). A study of Twitter during the 2016 US election between 16 September and 21 October put the number of active bots at around 400,000, which were “responsible for roughly 3.8 million tweets, about one-fifth of the entire conversation” (Bessi & Ferrara, 2016).

However, the individual user is not without agency: the majority of false stories on Twitter are still spread, and echo chambers still reinforced, by humans rather than bots (Vosoughi et al., 2018). Closely linked to the reinforcement of echo chambers on Twitter is the follow function, whereby users are encouraged to follow other users who hold similar ideological views to their own, restricting their exposure to ideologically challenging discourse (Guo et al., 2018; Himelboim, McCreery, & Smith, 2013). From a social psychology perspective, echo chambers act as an identity-securing protection from the epistemological and ontological uncertainties created by the vast amounts of (often conflicting) information available online (Lu & Yu, 2018). Linked to this is the decline in trust in the ability of traditional news media to provide reliable information (Knight Foundation, 2018).

While new research is contesting the placement of blame for echo chambers solely at the door of social media (Beam, Hutchens, & Hmielowski, 2018), there is little doubt that filter bubbles have “exacerbated political divisions and polarization” (Deb, Donohue, & Glaisyer, 2017, p. 4). This fracturing effect has negative implications for the mechanisms of liberal democracy, as developing a broad consensus around decisions made in the public good becomes increasingly difficult (OECD, 2017).


3.6 HATE SPEECH AND TROLLING


Linked to the above-mentioned fracturing effect has been the rise of hate speech and trolling. While hate speech obviously predates the internet, trolling is a term linked directly to internet cultures and, until recently, had more playful, less hurtful connotations (Phillips & Milner, 2017). Linked to the hacker breeding ground 4chan, a troll is an anonymous user who deliberately provokes antagonistic reactions for sheer enjoyment, or “the lulz” (Coleman, 2014). Trolling is aided both by the ease of creating anonymous online profiles (Galán-García, de la Puerta, Gómez, Santos, & Bringas, 2014) and by the atomised nature of internet interaction, both of which can exacerbate certain psychological profiles (Jakubowicz, 2017). A study which matched a personality survey with one on internet use found strong correlations “between trolling and the Dark Tetrad of personality... sadism, psychopathy, and Machiavellianism” (Buckels, Trapnell, & Paulhus, 2014).

Trolling can pose a direct threat to opportunities 2.2 and 2.3 when it becomes systematically targeted towards minority groups in order to deliberately cause emotional distress, i.e. when it becomes hate speech (Alkiviadou, 2018). While not all hate speech is articulated by trolls, remaining anonymous makes individuals more likely to escape prosecution for the more egregious examples (Holschuh, 2013), as complex and time-consuming tracking systems have to be employed to trace the perpetrators (Galán-García et al., 2014).

Racialised hate speech (otherwise known as cyber racism) is specifically targeted towards ethnic minority groups, and has become increasingly coordinated in recent years, through the rise of the “alt-right” (Jakubowicz, 2017). It has become a global phenomenon, affecting “refugees and ethnic minorities in Europe, Muslim Blacks and Jews in the United States, Indigenous Australians” (Jakubowicz et al., 2017, p. v) and others. It can have a direct negative impact on opportunities 2.2, 2.3 and 2.4, by encouraging affected groups to retreat to safe locations “where they focus on building intracommunal bonding” (Jakubowicz et al., 2017, p. xi), rather than engaging with national debates and institutions.

Sexualised hate speech is primarily targeted towards women (together with members of the LGBTQI community), and is characterised by its specifically misogynistic nature (Edstrom, 2016). It is often directed towards women in the public eye, or those in influential positions such as journalists, with proponents directing critical attention onto their supposed essential gender characteristics rather than their work (Edstrom, 2016). Research undertaken by The Guardian newspaper on its comment threads revealed that of the ten journalists who had attracted the most hateful comments, eight were women, and the other two were black men (Gardiner et al., 2016). This can have a negative impact on efforts to broaden the public sphere, as women journalists are discouraged from writing what they may feel are controversial stories.

More generally, a large survey conducted in Hong Kong (Chen, 2018) found a correlation between strong, vocal disagreement with an individual’s perspectives and a “spiral of silence” which acts to curtail the voicing of contentious opinions by minority groups. Particularly when the polarisation of the public sphere is heightened, fear of social isolation makes it more likely that users “express less disagreeing opinion and exercise more withdrawal behaviors” (p. 3928). The particular ability of trolls and hate speech to fan antagonistic “flames” (Ceron & Memoli, 2016), rather than promote rational debate, therefore directly undermines democratic participation.

3.7 DISTRUST/DISSATISFACTION WITH DEMOCRACY


This section details a longer-term, general process which has been ongoing since the 1960s (Ziegler, 2018), but which is intensified by the other threats already outlined in this section: increasing distrust of, and/or dissatisfaction with, democratic processes, government institutions and politicians. Emblematic of this decline in confidence is a 2016 Pew Research survey which found that trust in the US national government had hit an historic low of 20 percent (Ziegler, 2018). While New Zealanders’ trust in their public services was a little higher, at 42 percent when the data was last collected in 2015 (Stats NZ, 2015), a general, widespread decline in levels of trust in government is acknowledged to be a major issue affecting a majority of the wealthier, Western nations (OECD, 2017).

Deb et al. (2017) contend that “fundamental principles underlying democracy— trust, informed dialogue, a shared sense of reality, mutual consent, and participation— are being put to the test by certain features and attributes of social media” (p. 3). These include the aforementioned echo chambers which, when combined with the “proliferation of partisan media in traditional channels, has exacerbated political divisions and polarization” (p. 4).

Lu and Yu (2018), drawing on the World Values Survey 2010-14, found that the “decentralized, horizontal, interactive mode of information distribution” (p. 3) which characterises the internet increases intolerance of others (although they also found that participation in public deliberation with others from outside one’s echo chamber increases tolerance – more on this in section 4). More specifically, Ceron and Memoli (2016), drawing on a Eurobarometer survey which collects data from 27 European countries, found that it is the consumption of online forms of news including opinions that differ from the attitudes of the consumer which increases “the likelihood of ‘flames’ [a strong polarising effect] that increase skepticism toward democracy” (p. 226).

Hence these findings appear to go against the assumption, embedded within the “optimistic” perspective (see Castells, 2013), that increasing access to varied information and differing opinions automatically facilitates a more open, tolerant and inclusive society.



CONCLUSION

In this narrative literature review we outlined a cross-section of the recent international literature on the key opportunities for the expansion of digital democracy, and the current threats to actualising those opportunities.

The six key opportunities identified were: the democratisation of information publishing, the broadening of the public sphere, the increasing equality of access to and participation within political processes, increasing engagement in political processes, increasing transparency and accountability from government, and the promotion of democratic values.

These opportunities are at risk, however, from a significant group of threats, all of which are interconnected with the major structural threat posed by the increasing power of private platforms. The other six threats identified are: foreign government interference in democratic processes, surveillance and data protection, fake news (also known as misinformation and disinformation), filter bubbles (also known as echo chambers), hate speech and trolling, and a growing distrust of or dissatisfaction with democracy.

To ensure that the potential of digital media technologies is realised in relation to digital democracy, it is important to understand what the research says works to limit or overcome these threats. This is the focus of part two of this literature review.

REFERENCES

Alkiviadou, N. (2018). Hate speech on social media networks: towards a regulatory framework? Information & Communications Technology Law, 1-17. doi:10.1080/13600834.2018.1494417

Auger, G. A. (2013). Fostering democracy through social media: Evaluating diametrically opposed nonprofit advocacy organizations’ use of Facebook, Twitter, and YouTube. Public Relations Review, 39(4), 369-376. doi:https://doi.org/10.1016/j.pubrev.2013.07.013

Beam, M. A., Hutchens, M. J., & Hmielowski, J. D. (2018). Facebook news and (de) polarization: reinforcing spirals in the 2016 US election. Information, Communication & Society, 21(7), 940-958. doi:10.1080/1369118X.2018.1444783

Bennett, W. L., & Segerberg, A. (2012). The logic of connective action: Digital media and the personalization of contentious politics. Information, Communication & Society, 15(5), 739-768. doi:10.1080/1369118X.2012.670661

Bennett, W. L., Segerberg, A., & Walker, S. (2014). Organization in the crowd: peer production in large-scale networked protests. Information, Communication & Society, 17(2), 232-260. doi:10.1080/1369118X.2013.870379

Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 US Presidential election online discussion. First Monday, 21(11). doi:10.5210/fm.v21i11.7090

Blackwood, L., Hopkins, N., & Reicher, S. (2013). Turning the Analytic Gaze on “Us”: The Role of Authorities in the Alienation of Minorities. European Psychologist, 18(4), 245-252. doi:10.1027/1016-9040/a000151

Buckels, E. E., Trapnell, P. D., & Paulhus, D. L. (2014). Trolls just want to have fun. Personality and Individual Differences, 67, 97-102. doi:10.1016/j.paid.2014.01.016

Castells, M. (2013). Communication power (Second ed.). Oxford: Oxford University Press.

Ceron, A., & Memoli, V. (2016). Flames and Debates: Do Social Media Affect Satisfaction with Democracy? Social Indicators Research, 126(1), 225-240. doi:10.1007/s11205-015-0893-x

Chen, H.-T. (2018). Spiral of silence on social media and the moderating role of disagreement and publicness in the network: Analyzing expressive and withdrawal behaviors. New media & society, 20(10), 3917-3936. doi:10.1177/1461444818763384

Coleman, G. (2014). Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous. London: Verso.


College of St George. (2018). Democracy in a Post-Truth Information Age: Background Paper. Retrieved from Windsor: https://www.stgeorgeshouse.org/wp-content/uploads/2017/10/Background-Paper.pdf

Deb, A., Donohue, S., & Glaisyer, T. (2017). Is Social Media a Threat to Democracy? Omidyar Group. Retrieved from https://www.omidyargroup.com/wp-content/uploads/2017/10/Social-Media-and-Democracy-October-5-2017.pdf

Dubow, T. (2017). Civic Engagement: How Can Digital Technologies Underpin Citizen-Powered Democracy? Retrieved from https://www.rand.org/pubs/conf_proceedings/CF373.html

Edstrom, M. (2016). The trolls disappear in the light: Swedish experiences of mediated sexualised hate speech in the aftermath of Behring Breivik. International Journal for Crime, Justice and Social Democracy, 5(2), 96-106.

Farkas, J., & Schou, J. (2018). Fake News as a Floating Signifier: Hegemony, Antagonism and the Politics of Falsehood. Javnost - The Public, 25(3), 298-314. doi:10.1080/13183222.2018.1463047

FireEye. (2018). Suspected Iranian influence operation: Leveraging inauthentic news sites and social media aimed at U.S., U.K., other audiences. Retrieved from https://www.fireeye.com/content/dam/fireeye-www/current-threats/pdfs/rpt-FireEye-Iranian-IO.pdf

Fuchs, C. (2014). Social Media and the Public Sphere. tripleC, 12(1), 57-101.

Fuchs, C., & Trottier, D. (2017). Internet surveillance after Snowden: A critical empirical study of computer experts’ attitudes on commercial and state surveillance of the Internet and social media post-Edward Snowden. Journal of Information, Communication and Ethics in Society, 15(4), 412-444. doi:10.1108/JICES-01-2016-0004

Galán-García, P., de la Puerta, J. G., Gómez, C. L., Santos, I., & Bringas, P. G. (2014). Supervised Machine Learning for the Detection of Troll Profiles in Twitter Social Network: Application to a Real Case of Cyberbullying. Paper presented at the International Joint Conference SOCO’13-CISIS’13-ICEUTE’13, Cham.

Gardiner, B., Mansfield, M., Anderson, I., Holder, J., Louter, D., & Ulmanu, M. (2016). The dark side of Guardian comments. The Guardian. Retrieved from https://www.theguardian.com/technology/2016/apr/12/the-dark-side-of-guardian-comments

Gayo-Avello, D. (2015). Social Media, Democracy, and Democratization. IEEE MultiMedia, 22(2), 10-16. doi:10.1109/MMUL.2015.47

Ghosh, D., & Scott, B. (2018). #digitaldeceit: The Technologies Behind Precision Propaganda on the Internet. Retrieved from https://www.newamerica.org/public-interest-technology/policy-papers/digitaldeceit/

Government Information Services. (2018). How digital can support participation in government. Retrieved from Wellington: https://www.digital.govt.nz/standards-and-guidance/engagement/online-engagement/research-how-digital-can-support-participation-in-government/

Guo, L., Rohde, J. A., & Wu, H. D. (2018). Who is responsible for Twitter’s echo chamber problem? Evidence from 2016 U.S. election networks. Information, Communication & Society, 1-18. doi:10.1080/1369118X.2018.1499793

Himelboim, I., McCreery, S., & Smith, M. (2013). Birds of a Feather Tweet Together: Integrating Network and Content Analyses to Examine Cross-Ideology Exposure on Twitter. Journal of Computer-Mediated Communication, 18(2), 40-60. doi:10.1111/jcc4.12001

Holcomb, J., Mitchell, A., & Purcell, K. (2015). Investigative journalists and digital security. Pew Research Center. Retrieved from http://www.journalism.org/2015/02/05/investigative-journalists-and-digital-security

Holschuh, J. (2013). #CivilRightsCybertorts: Utilizing Torts to Combat Hate Speech in Online Social Media. University of Cincinnati Law Review, 82, 953.

Human Rights Foundation of Aotearoa New Zealand. (2016). And you think maybe this is not home. Human Rights Foundation of Aotearoa New Zealand. Retrieved from https://humanrightsfoundation.files.wordpress.com/2019/03/and-you-think-maybe-this-is-not-home-final.pdf

IGF. (2015). Recommendations on terms of service and human rights. Retrieved from https://www.intgovforum.org/cms/documents/igf-meeting/igf-2016/830-dcpr-2015- output-document-1/file

InternetNZ. (2017b). What kiwis think of the Internet in New Zealand. Retrieved from https://internetnz.nz/2017-internet-research

Isaak, J., & Hanna, M. J. (2018). User Data Privacy: Facebook, Cambridge Analytica, and Privacy Protection. Computer (8), 56. doi:10.1109/MC.2018.3191268

Jackson, S. K., & Kuehn, K. M. (2016). Open Source, Social Activism and “Necessary Trade-offs” in the Digital Enclosure: A Case Study of Platform Co-operative, Loomio.org. tripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society, 14(2), 413–427.

Jakubowicz, A. (2017). Alt_Right White Lite: trolling, hate speech and cyber racism on social media. Cosmopolitan Civil Societies: An Interdisciplinary Journal, 9(3), 41.

Jakubowicz, A., Dunn, K., Mason, G., Paradies, Y., Bliuc, A.-M., Bahfen, N., Oboler, A., Atie, R., Connelly, K. (2017). Cyber Racism and Community Resilience: Strategies for Combating Online Race Hate: Springer, 5-11, 242. doi:10.1007/978-3-319-64388-5

Kern, A. (2017). The effect of direct democratic participation on citizens’ political attitudes in Switzerland: The difference between availability and use. Politics and Governance, 5(2), 16-26. doi:10.17645/pag.v5i2.820

Kern, A. (2018). What happens after a local referendum? The effect of direct democratic decision-making on protest intentions. Local Government Studies, 44(2), 183-203. doi:10.1080/03003930.2017.1411809


Kim, S., & Lee, J. (2012). E-Participation, Transparency, and Trust in Local Government. Public Administration Review, 72(6), 819-828. doi:10.1111/j.1540-6210.2012.02593.x

