
New Zealand Law Students' Journal



Turnbull, Mahoney --- "Navigating New Zealand's digital future: coding our way to privacy in the age of analytics" [2015] NZLawStuJl 4; (2015) 3 NZLSJ 420



NAVIGATING NEW ZEALAND’S DIGITAL FUTURE: CODING OUR WAY TO PRIVACY IN THE AGE OF ANALYTICS

MAHONEY TURNBULL*


This article was chosen as first equal for best article by the Editor-in-Chief and the Academic Review Board.

I. Introduction

We are now in the midst of a data revolution and New Zealand’s digital future is uncertain. The data path we are steering towards is taking us into new cyber territory and is challenging fundamental concepts in privacy and data protection law. In this new digital terrain, big data represents a highly valuable reserve of personal information ripe for the picking. It is ‘the new oil’,1 and the digital space that this sought-after commodity operates in is highly unregulated and open to exploitation. New Zealand’s position on the extraction of this critical resource will be indicative of our commitment to enabling protected use of shared data to deliver a prosperous society. New Zealand’s policy is suffering a critical regulatory disconnect, with technology fast outstripping the legislation that exists in the data protection domain. This article addresses the question of how New Zealand’s data stewardship should be governed, with a view to proposing how personal data management could benefit from clear national guidelines.

* BA/LLB(Hons), University of Otago. Mahoney is employed at New Zealand Trade and Enterprise in the United States.

1 Bruce Schneier, Chief Security Technology Officer, British Telecom "Privacy in the Age of Big Data" (speech presented to the New Zealand Privacy Forum, Wellington, May 2012) available at <https://www.youtube.com/watch?v=L_UIdkbp3xo>.



Part II provides context to the burgeoning industry surrounding big data and the role that it now plays in the ever-evolving digital economy. First, I consider the ecosystem in which big data lives, and the positive outcomes from strategic use and reuse of the wealth of data available. This is balanced by an examination of the more disturbing uses of data that have the potential to cause devastating data 'oil spills' and privacy scares. Following on from the contextual analysis, Part III outlines New Zealand's current legal framework and asks whether our data protection architecture can truly claim 'adequate' status. It will focus on the key features of the Privacy Act 1993 that arguably place New Zealand's data protection future in good stead. In contrast, this section will also elucidate the technical inconsistencies that have emerged and render the Act outmoded. Expanding upon the evolving genre of what constitutes personally identifiable information, this analysis will probe the current de-identification techniques and the shift towards personally 'predictable' information, both trends that challenge the efficacy and endurance of the Privacy Act. Furthermore, it will highlight the need to progress towards more relevant legal tools to oversee responsible data disclosure behaviour.

Picking up on this concern, Part IV explores the regulatory remedy that New Zealand could pursue in order to create progressive mechanisms that enable a coherent data management framework. By analysing examples of Standards Authorities and recent legislative changes, this section lays out the potential scope of the proposed Data Standards Authority, and the administrative features that would need to be addressed in forming this body.

The final part elucidates the principles that ought to govern the Data Standards Authority. It offers concrete suggestions for two overarching principles that may serve as valuable touchstones for the resulting data standards that would be contained within industry-specific codes. The initial focus will be the Privacy by Design principle, involving clarification of de-identification protocols that reflect the importance of technological systems in tackling the legal issues at stake. The second principle will address the core problem of data empowerment, and offer ways in which consent can be enhanced through informed and live consent. My proposal promotes a paradigm shift in privacy policies towards the idea of user-centricity, and the creation of consumer-friendly privacy settings. It encourages a reshaping of the rules of engagement to secure a more responsive data ecosystem that can unlock the value of data within a more digitally relevant legislative landscape.
II. The Seed from which Data Grew

This part establishes the potential benefits and risks that have arisen from the big data industry. It will shed light on the nature of the data ecosystem and the value that lies in effective use of personal data sets. It also lays out the inherent risks associated with this unregulated industry, which has the power to generate privacy breaches through adverse and discriminatory profiling. Exposing the harms that flow from data misuse, this part will underline the importance of effective data regulation if New Zealand is to maximise the value from granular analysis of people, behavioural patterns and the environment.

~~~~~

It is hard to imagine the world without the internet: a world without data and the ubiquitous connectivity that we as digital natives feel empowered to engage with. Our digital habitat is one where neither borders nor language appear as barriers to communication.2 It is within this environment that we are witnessing the dawn of the data-driven era. A big data tsunami has risen, and is prompting a new industrial revolution driven by analytics, computation and automation. The tracking of human activities, industrial processes and research is leading to data collection and processing on an unprecedented scale, spurring new products and services as well as business opportunities and scientific discoveries.3

2 Ronald Deibert and Rafal Rohozinski "Beyond Denial" in Ronald Deibert, John Palfrey, Rafal Rohozinski and Jonathan Zittrain (eds) Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace (MIT Press, Cambridge, 2010) at 9.

The term 'big data' refers to datasets beyond the scale of a typical database, which are held and analysed using computer algorithms.4 It is the 'non-trivial extraction of implicit, previously unknown and potentially useful information from data'.5 In essence, the concept of big data combines more data, faster computers and novel analytics, which organisations, governments and businesses use to extract both hidden information and surprising correlations.6 The newly discovered information that results is not only unpredictable but also emerges from a fairly opaque process.7

3 European Commission Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Towards a Thriving Data-Driven Economy (Brussels, July 2014) available at <http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=6216> at 2; Cisco Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2012–2017 (Cisco, 2013) available at <http://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white_paper_c11-520862.pdf>. The number of mobile connected devices has now exceeded the number of people on the planet. By 2020 an estimated 50 billion devices will be wirelessly connected.

4 European Commission Privacy and Competitiveness in the Age of Big Data (Brussels, April 2014) available at <http://www.insideprivacy.com/international/european-union/the-new-edps-opinion-privacy-and-competitiveness-in-the-age-of-big-data/> at 6; McKinsey Global Institute Big Data: The New Frontier for Innovation, Competition and Productivity (1 May 2011) available at <http://www.mckinsey.com/client_service/telecommunications/latest_thinking> at 2; Wei Fan and Albert Bifet "Mining Big Data: Current Status, and Forecast to the Future" (2012) 14(2) ACM SIGKDD Explorations at 9. This references the first time the term 'big data' appeared, in a 1998 Silicon Graphics slide deck by John Mashey.

5 Usama Fayyad and others (eds) Advances in Knowledge Discovery and Data Mining (MIT Press, Cambridge, 1996) at 37 as cited in Tal Zarsky "Mine Your Own! Making the Case for the Implications of the Data Mining of Personal Information in the Forum of Public Opinion" (2003) 5 Yale J L & Tech 2 at 6, n 13.

As its name implies, the hallmark of big data is its quantitative greatness. In juxtaposition to the gains made from shrinking scope in the field of nanotechnology, big data gains its force from its sheer magnitude.8 Now estimated to be in the order of zettabytes,9 the phenomenal production of data coupled with escalating storage capacity is enabling collection and sharing of information at unprecedented levels.10 Whereas in the analogue age this kind of storage was costly and time-consuming, the current trend of 'datafication' and cloud-based servers is enabling rapid shifts. This change of scale has led to a change of state, and the quantitative growth is now prompting a qualitative one.11


6 Ira Rubenstein “Big Data - The End of Privacy or a New Beginning?” (2013) 3(2) International Data Privacy Law 74 at 1; Viktor Mayer-Schönberger and Kenneth Cukier Big Data: A Revolution That Will Transform the Way We Live, Work and Think (1st ed, Eamon Dolan, New York, 2013) at 7. Society will need to shed some of its obsession for causality in exchange for simple correlations.

7 Tal Zarsky “Desperately Seeking Solutions: Using Implementation-Based Solutions for the Troubles of Information Privacy in the Age of Data Mining and the Internet Society” (2004) 56 (13) Me L Rev at 13.

8 At 10.

9 A zettabyte is equivalent to one sextillion (10^21) bytes; its binary counterpart, two to the seventieth power bytes, is a zebibyte.

10 Julie Brill, Federal Trade Commissioner "Reclaim your name" (speech presented at NYU Sloan Lecture Series: Privacy in the World of Big Data, NYU, October 2013) available at <http://engineering.nyu.edu/sloanseries/reclaim-your-name.php>.

11 Mayer-Schönberger and Cukier, above n 6, at 5–10. The fact that 90 per cent of the world’s data was only generated in the last two years, with this figure doubling every two years from now on, indicates the overwhelming pace at which big data is growing, as a wholly regenerative resource. If all the data today was placed on CDs and stacked up, it would stretch to the moon in more than five separate piles.



Compounding this issue is the Internet of Things (IoT).12 This phenomenon reflects the 'machine-plus-human' hybrids that life in the digital age is making more mainstream.13 Our lives are digitally disassembled, disaggregated and dispersed into a multitude of digital domains.14 Within this space, we are seeing the rise of connected devices, which will push data accumulation to unparalleled levels.15 The increasing number of people, devices and sensors that are now connected by digital networks has revolutionised the ability to generate, access and share data.16 Mobile devices are not the only sensory gateway, as embedded technologies that passively collect data pervade the marketplace.17 This trail of digital breadcrumbs, via the world of ambient intelligence, is creating an immense data ocean18 in which the race to create new algorithms is pulling us in diverging directions.19

12 Mireille Hildebrandt "Who is Profiling Who?" in Gutwirth and others (eds) Reinventing Data Protection (Springer, Amsterdam, 2009) at 239.

13 Lisa Gitelman (ed) Raw Data is an Oxymoron (MIT Press, Cambridge, 2013) at 10.

14 Deibert, above n 2, at 9. In this sense, cyberspace is not so much a distinct realm as it is the very environment that we inhabit.

15 Larry Hardesty "Algorithm recovers speech from vibrations of potato-chip bag filmed through soundproof glass" (4 August 2014) Phys.org <http://phys.org/news/2014-08-algorithm-recovers-speech-vibrations-potato-chip.html>. The emerging possibilities in gathering data on physical assets could also generate a new level of data signals. MIT researchers are now reconstructing audio signals by analysing vibrations of objects.

16 Jules Polonetsky and Omer Tene "Big Data for All: Privacy and User Control in the Age of Analytics" (2013) 11(5) Nw J Tech & Intell Prop 239 at 241.

17 Rubenstein, above n 6, at 77. By 2020 the majority of data will be collected passively and automatically: Drew Olanoff "Google wants to serve you ads based on the background noise on your phone calls" (21 March 2012) The Next Web <http://thenextweb.com/google/2012/03/21/google-wants-to-serve-you-ads-based-on-the-background-noise-of-your-phone-calls/>. To this end, Google has already patented technology that listens to the background noise in your phone call to deliver targeted advertising.

18 Email from Mia Garlick, Head of Policy, Facebook Australia and New Zealand, to Mahoney Turnbull regarding data governance structures (8 August 2014).

Whilst this data, hailed as the 'new oil', may be ripe for mining, it also poses considerable risks. Privacy expert Bruce Schneier has long warned of the data pollution problem, reflecting our tendency to storm into a digital era whilst naively overlooking the deluge of data we leave behind.20 True to Moore's law,21 a new landscape of data accumulation has emerged,22 giving rise to the infinite 'digital tattoo'.23

The business model of the digital ecosystem24 is geared towards our commodification. In this sense, the users are the "products not the customers", and are responsible for generating the value as well as the by-product.25 It is becoming clearer that big data poses significant challenges to the sanctity of the individual.26 The data dependency is an inequitable one, in which data assets are subject to market distortion that inhibits users from gaining true value for their data.27 To facilitate this undemocratic process, a culture is developing in which socio-technical systems are expertly configured to obscure privacy features. The veil that can be pulled over users' eyes promotes a sense of the unknown, to the extent that individuals are now signing over their children for access to desirable online platforms.28 The issue of consent, or lack thereof, is addressed in Part V.

19 Fan and Bifet, above n 4, at 1.

20 Schneier, above n 1.

21 Moore's law observes that the number of transistors on a chip, and with it overall processing power, doubles approximately every two years. True to this phenomenon, there has been a simultaneous reduction in storage costs and increase in data production.

22 OECD Thirty Years After: The OECD Privacy Guidelines (OECD, 2011) available at <http://www.oecd.org/sti/ieconomy/49710223.pdf> at 8.

23 Juan Enriquez "How to think about digital tattoos" (podcast, December 2012) TEDTalks <https://www.ted.com/talks/juan_enriquez_how_to_think_about_digital_tattoos>.

24 Andrew McAfee "Big Data: The Management Revolution" Harvard Business Review (online ed, Boston, December 2012).

25 Siva Vaidhyanathan The Googlization of Everything (And Why We Should Worry) (University of California Press, Berkeley, 2011) at 111. The data users not only provide the raw materials to determine and deliver relevant search ads, but are used to train its search algorithms to develop new data-intensive services.

26 Mayer-Schönberger and Cukier, above n 6, at 17.

27 Schneier, above n 1.

The purchasing power of data has been hailed as a disruptive force to the current business model. The 'freemium'29 model is a contentious element of the big data sensation, and highlights the core reliance on accessible data extraction to enable the data monetisation machine to run smoothly. Firms will not realistically provide services for free unless doing so enhances their data harvest through valuable sets of personal data points.30 This industry certainly has the potential to develop anti-competitive behaviour, with data brokers mediating the trade in data and overseeing the increasing digital servitude.31 The bewildering acceptance of the emptiness of 'free' services seems to indicate online platforms may well be as powerful a narcotic as the soma was in Huxley's Brave New World.32

28 Tom Fox-Brewster "Londoners give up eldest children in public Wi-Fi security horror show" (29 September 2014) The Guardian <http://www.theguardian.com/technology/2014/sep/29/londoners-wi-fi-security-herod-clause>.

29 The 'freemium' business model is one in which the company gives away the core product for free to the majority of users and sells premium products to a smaller fraction of this user base.

30 Viviane Reding, Vice-President, European Commission "Making Europe the Standard Setter for Modern Data Protection Rules in the Digital Age" (speech presented to the Digital Age Innovation Conference, DLD Munich, January 2012) available at <http://europa.eu/rapid/press-release_SPEECH-12-26_en.htm>.

31 European Commission, above n 4, at 10. Data brokers collect personal information about consumers and sell that information to other organisations, using a variety of public and non-public sources including website cookies and loyalty card programmes to create profiles of individuals for marketing and other purposes; Alexandra Suich "Special Report Advertising and Technology: Getting to Know You" The Economist (13 September 2014) at 5. Data-broking firms may specialise in selling certain segments: eXelate sells a "men in trouble" segment, whereas IXI specialises in the "burdened by debt" segment.

It is against this backdrop of data wealth that we are witnessing a global call to embrace the digital data renaissance.33 Industries are moving towards data-driven systems34 with personal information now operating as the currency of the digital economy, which is growing at unprecedented levels. This is no ordinary asset, but one that can offer a steady stream of innovation and new services to those with the humility, willingness and the tools to listen.35

Yet in the face of this compelling movement, the New Zealand economy has shown a considerable lag in embracing the data revolution, and could also be criticised for lacking sufficient R&D on data. Combined with a shortage of data experts, there is a definite lack of industrial capability when compared to countries like the United States. New Zealand should consider an approach similar to the United Kingdom's, which announced the establishment of a world-class research centre for big data science in its 2014 budget.36 Indeed, new opportunities exist in a number of sectors where the application of these methods is still in its infancy and globally dominant players have not yet emerged. New Zealand has the chance to capitalise on this gap and ensure that a robust regulatory framework is created. Treating data as a strategic asset that benefits from clear governance machinery and legal protections will ensure that the data-use ecosystem can move with the pace of this industry.

32 Aldous Huxley Brave New World (Harper Collins, New York, 2000); Alessandro Acquisti "Why Privacy Matters" (podcast, 18 October 2013) TEDTalks <http://www.ted.com/talks/alessandro_acquisti_why_privacy_matters>. These online 'free to download' games may expand our digital freedom, yet also carry the price of privacy invasion and exploitation.

33 Ian Fletcher, Director, Government Communications Security Bureau "Privacy and Security: Identity, society and the state in the internet age" (speech at NZ Privacy Forum Week, Wellington, 7 May 2014) at 2.

34 Schneier, above n 1.

35 Mayer-Schönberger and Cukier, above n 6, at 5.

36 Department for Business, Innovation and Skills "Plans for World Class Research Centre in the UK" (United Kingdom Government, 19 March 2014) available at <https://www.gov.uk/government/news/plans-for-world-class-research-centre-in-the-uk>.
(i) The 'Big' Benefits of Big Data

"The ability to see the details of the market, of political revolutions, and be able to predict and control them is definitely a case of Promethean fire – it could be used for good or for ill, and so Big Data brings us to interesting times. We're going to end up reinventing what it means to be a human society."37


Whilst a lot of criticism has been levelled at the wave of big data flooding our digital environment, there is no doubt that the "dual use"38 of big data can be readily harnessed to serve the public good in a multitude of ways. The increasing synonymy of big data with data analysis, which is the lynchpin of modern science, considerably constrains any argument against its fundamental value.39 It is the new 'final frontier' for scientific data research, and we seem to be at the beginning of a new era in which we are unearthing novel knowledge.40 Big data will yield important benefits, whether applied to medicines, climate, food safety or geo-spatial mapping.41 Moreover, in the commercial sphere, global studies show that it can create "significant value for the world economy, enhancing the productivity and competitiveness, and creating substantial economic surplus for consumers".42 The gains to be had from big data are certainly big, and have the power to generate new, life-enhancing outcomes.

Big data offers the capacity to unleash a wave of innovation through the 'featurization' of data.43 As big data pioneer Sandy Pentland has noted, big data has the power to bring to light information about people's behaviour.44 The most valuable class of big data does not originate from Facebook posts or RFIDs, for instance, but from behaviour-based digital footprints like location data, credit card data and quantified-self data. Importantly, this data manages to operate free from the self-editing that underpins personal posts on platforms like Facebook. Extrapolating certain behaviours derived from evidence not explicitly in the data enables a powerful flow-on effect of comparable analytics. Companies are no longer confined to averages, but have in their possession data that is opening up astounding changes in the granularity45 of individual analysis.46

37 New Zealand Data Futures Forum (NZDFF) Full Discussion Paper (New Zealand, 2014) available at <https://www.nzdatafutures.org.nz/sites/default/files/first-discussion-paper_0.pdf> at 10.

38 Executive Office of the President Podesta Report: Big Data: Seizing Opportunities, Preserving Values (Washington, 1 May 2014) available at <http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf> at 56. This refers to the "dual use" of data, as the contextual use can either be beneficial or harmful.

39 At 340.

40 Fan and Bifet, above n 4, at 4.

41 Paul Ohm "The Underwhelming Benefits of Big Data" (2013) 161 U Pa L Rev Online 339 at 339; Jan Eliasson, Deputy Secretary-General "Remarks on a Data Revolution for Sustainable Development" (speech presented to the United Nations Independent Expert Advisory Group for Big Data, 24 September 2014) available at <http://www.undatarevolution.org/2014/09/26/deputy-secretary-generals-data-revolution/>.

42 McKinsey, above n 4, at 1–2.

43 Tene and Polonetsky, above n 16, at 242. The 'featurization' refers to user-side applications and services based on access to personally identifiable information.

44 NZDFF, above n 37, at 10.

45 The ‘granularity’ of data refers to the customised breakdown of personal data sets that offers greater insights into an individual’s behaviour patterns.

46 NZDFF, above n 37, at 10.



The connecting force of big data offers “super wicked”47 correlations between behaviours and outcomes. There are distinct advantages when compared to traditional forms of web science analysis, particularly when examining financial bubbles and recessions.48 Recent research from Warwick and Boston Universities has produced fresh evidence of methods identifying search terms that precede stock market crashes.49 The real value also lies in these predictive modelling techniques being applicable to other commercial factors, which could signal a new modus operandi for the financial industry.
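A minimal sketch of the kind of lagged search-volume analysis these studies describe might look as follows. This is purely illustrative: the data is synthetic and the column names are hypothetical, and it is a simplified stand-in for the published methodology rather than the researchers' actual code.

```python
import numpy as np
import pandas as pd

# Synthetic weekly series standing in for Google Trends search volume
# and a market index (hypothetical values, random-walk noise).
rng = np.random.default_rng(0)
weeks = pd.date_range("2014-01-05", periods=200, freq="W")
data = pd.DataFrame({
    "search_volume": 50 + rng.normal(0, 5, len(weeks)).cumsum(),
    "index_close": 5000 + rng.normal(0, 40, len(weeks)).cumsum(),
}, index=weeks)

# Signal: how far this week's search volume sits above its trailing
# k-week average (a jump in anxious searching).
k = 3
trailing_mean = data["search_volume"].rolling(k).mean().shift(1)
delta = data["search_volume"] - trailing_mean

# Lagged relationship: does a jump in searches this week precede a fall
# in the index next week? On real data, a negative correlation would
# support the 'searches precede crashes' hypothesis.
next_week_return = data["index_close"].pct_change().shift(-1)
print(f"lagged correlation: {delta.corr(next_week_return):.3f}")
```

On this synthetic input the correlation is of course near zero; the point is only to show how a lagged search-volume signal is aligned against subsequent market returns.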

The inherent human component of big data also enables the analysis to assume a more holistic form of knowledge discovery. For the public sector, ‘smart data’ can lead to stronger policymaking decisions by providing sophisticated evidentiary bases.50 Smart data can then be used in real time to monitor the efficacy of policy decisions and allows for adjustments, which can make solutions even more effective.

47 Jonathan Boston "A New Global Climate Change Treaty – Can Humanity Deliver? Our Challenge after Durban for 2015" (paper presented at University of Otago, Dunedin, 14 March 2012) at 4. "A super-wicked problem has the following characteristics: the policy is complex and controversial, with competing problem definitions; all the available solutions are problematic; delay is costly; those most responsible for the problem have the least incentive to solve it; and the central control or enforcement mechanisms are weak".

48 NZDFF, above n 37, at 11.

49 Alice Truong "How Google searches can predict the next stock market crash" (24 July 2014) Fast Company <http://www.fastcompany.com/3033661/fast-feed/how-google-searches-can-predict-the-next-stock-market-crash>. By correlating the most valuable information in search engine data that has less obvious semantic connections to events, the potential exists for historic links to be gauged and future falls anticipated.

50 Spark "Submission to the New Zealand Data Futures Forum" (Wellington, July 2014). Spark chooses to use the term 'smart data' rather than big data. Smart data can provide deep analysis of a problem, help identify its root causes and find correlations with other data. The term emphasises the latent value inherent in data sources, and avoids the negative connotations of the harms connected to big data.



Harnessing big data for development is another strategic outcome. Humanitarian-orientated 'Born Digital' projects are indicating the transformative impact of data through real-time feedback and early warning capabilities.51 Catalyst projects such as 'Global Pulse' and UN work in Asia attest to the power of detecting emerging vulnerabilities.52 Looking at its use in the developed world, the number of lives 'saved' by a Stanford professor pursuing data mining techniques and novel signal-detection algorithms reinforces the significant gains in healthcare that can flow from big data.53

The economic benefits from geospatial data are equally promising. The flood of fresh sensing data, combined with 'smart grid' functionality, is signalling a new era of 'sensing cities', as seen in the context of Christchurch.54 Access to mobility data to track population trending patterns could help spur constructive outcomes for Auckland's housing development issues.55 The working relationship between Auckland Council and citizens to enable strategic assessment of growth capacity is just the beginning of new data-driven methodologies for dynamic public engagement.
(ii) The 'Big' Concerns of Big Data: The Era of Predictive Analytics

Big data poses serious privacy concerns that could stir a regulatory backlash, stifle innovation and dampen the data economy. The risks we are seeing emerge have the potential to override the value to be gained from smart data.56 Thus, stronger data protection and data management must be engineered. New Zealand's legal mechanisms that deal with privacy and data protection should be re-examined and refreshed to cope with the negatives of predictive analytics.

51 PricewaterhouseCoopers Big Data: Big Benefits and Imperilled Privacy (United States, June 2014) at 5.

52 Fan and Bifet, above n 4, at 2.

53 Tene and Polonetsky, above n 16, at 246. This data study showed the adverse effects of a diabetes drug by exposing the correlation between use of the drug and 27,000 cardiac arrests. This led to the drug's withdrawal from the market.

54 Sensing City "Project to Create Sensing Cities Launches in Christchurch" (4 September 2014) <http://www.sensingcity.org/stay-informed/project-to-create-%E2%80%98sensing-cities%E2%80%99-launches-in-christchurch>.

55 Interview with Cyrus Facciano, General Manager of Qrious (Mahoney Turnbull, 25 July 2014).

The prime cause for anxiety stems from how individuals can be profiled and targeted.57 This poses a serious threat to data subjects' ability to exercise the fundamental freedoms safeguarded in the New Zealand Bill of Rights Act, namely freedom of expression and freedom of thought, conscience and religion.58 In the context of big data, this manifests in restrictions on an individual's capacity to act with agency and consume online information without being subject to unjustifiable manipulation. The issue here is not aggregation, but rather disaggregation of personal insights that can be brokered and used against the individual.59 There is no shortage of evidence of the ability of analysts to proactively anticipate, persuade and manipulate individuals and markets.60 The criticism directed at companies who "vampirically feed off our identities" should not be taken lightly, and highlights the looming 'dataveillance' that is casting big data in a darker light.61 The most recent White House report has reinforced this sentiment and called for expanded technical expertise to halt the discrimination leading big data down a digitally manipulative track.62

56 Jan Eliasson, above n 41.

57 Rubenstein, above n 6, at 24; Ryan Calo "Digital Market Manipulation" Geo Wash L Rev (2014) (forthcoming).

58 New Zealand Bill of Rights Act 1990, ss 13–14.

59 Fletcher, above n 33.

60 World Economic Forum Rethinking Personal Data (Geneva, May 2014) available at <http://reports.weforum.org/rethinking-personal-data/> at 24.

61 Gitelman, above n 13, at 10. Surveillance in the context of big data is an expansive term, not limited to espionage or video monitoring, but covering "any collection and processing of personal data, whether identified or not, for the purposes of influencing and monitoring those whose data has been garnered": see David Lyon The Surveillance Society (Open University Press, Philadelphia, 2001) at 3.

The danger of predictive profiling is a persuasive factor in the appeal for a stronger data protection regime. The 'pregnancy score' formulated by Target provides one pertinent example of how the big data industry is encroaching on the personal realm, resulting in discriminatory profiling and constraining fundamental freedoms.63 Corporates are becoming increasingly adept at executing profiling with alarming specificity and foresight. Target's capacity to employ time-tracking analytics on the types of purchases made by customers enabled a timeline that predicted precise stages of their customers' pregnancy cycles.64 It was against this backdrop that the 'creepiness' Panopticon-like threshold set in,65 and customers began to question the extent of Target's consumer tracking systems.66 This predictive analysis is disturbing when sensitive categories protected by New Zealand's rights-based legislative instruments, such as health, race and sexuality, are compromised.67 It is one thing for a customer to be recommended books they may be interested in to enable more 'efficient' consumption patterns, but it is quite another to surreptitiously track when a customer is pregnant before her closest family even know. Alarmingly, the accumulated knowledge organisations hold about users enables them to infer desires before individuals even form them, and to buy products on their behalf before they even know they need them.68

62 Executive Office of the President, above n 38, at 30. The Report highlights the capacity to segment data subjects and stratify customer experiences so seamlessly as to be almost undetectable.

63 Charles Duhigg "How Companies Learn Your Secrets" New York Times Magazine (online ed, New York, 16 February 2012). This revealed the situation where a girl's father only discovered his teenage daughter was pregnant after Target had pre-determined this via her buyer behaviour and sent various pregnancy-related promotional material to the home address.

64 Tene and Polonetsky, above n 16, at 253.

65 Jeremy Bentham Panopticon; Or, The Inspection-House: Containing The Idea of a New Principle of Construction applicable to any Sort of Establishment, in which Persons of any Description are to be kept under Inspection: And in Particular To Penitentiary-Houses, Prisons, Houses of Industry, Workhouses, Poor Houses, Manufactories, Mad-Houses, Lazarettos, Hospitals, And Schools: With a Plan Of Management adapted to the principle: in a series of letters, written in the year 1787, from Crecheff in White Russia (T Payne, London, 1791).

66 Quentin Hardy "Rethinking Privacy in an Era of Big Data" The New York Times (4 June 2012) <http://bits.blogs.nytimes.com/2012/06/04/rethinking-privacy-in-an-era-of-big-data/?_php=true&_type=blogs&_r=0>.
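The mechanics behind a purchase-based score of the kind Duhigg describes can be sketched in a few lines. The following is purely illustrative: the product categories, weights and bias are hypothetical stand-ins, not Target's actual model, which was reportedly fit to historical purchase data across thousands of products.

```python
import math

# Hypothetical weights of the sort a retailer might learn from shoppers
# whose outcome is already known (e.g. via a baby-shower registry).
WEIGHTS = {
    "unscented_lotion": 1.2,
    "mineral_supplements": 0.9,
    "cotton_balls_bulk": 0.7,
    "scent_free_soap": 0.8,
}
BIAS = -3.0  # keeps the baseline score low for shoppers with no signals

def pregnancy_score(basket):
    """Map a customer's recent purchases to a probability-like score in (0, 1)."""
    z = BIAS + sum(weight for item, weight in WEIGHTS.items() if item in basket)
    return 1 / (1 + math.exp(-z))  # logistic squashing

basket = {"unscented_lotion", "mineral_supplements", "scent_free_soap"}
print(f"score: {pregnancy_score(basket):.2f}")  # ~0.48 for this basket
```

The disquieting point the article makes is that nothing in such a model requires the sensitive attribute ever to be disclosed: it is inferred, item by item, from otherwise innocuous transactions.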

The profiling problem and its threat to freedom from discrimination can also be seen in automated decision-making assumptions.69 This situation seems to have the hallmarks of Chomsky's 'manufactured consent',70 where data controllers have enormous discretion in determining what the user 'wants' to see. The trend towards 'dynamic pricing' is shifting focus onto browser history and postcodes as the key pricing mechanisms in online shopping experiences.71 Invisible decisions made on the basis of data-driven assumptions also run the risk that users, faced with increasing privacy intrusions, will decide to forgo online-enabled services. Not only does this deepen the digital divide72 and exacerbate issues around s 14 of the Bill of Rights Act, but it also spurs negative impacts on innovation and engagement in the digital economy.73

67 New Zealand Human Rights Act 1993, s 21.

68 Acquisti, above n 32.

69 New Zealand Bill of Rights Act 1990, s 14.

70 Edward Herman and Noam Chomsky Manufacturing Consent: The Political Economy of the Mass Media (Pantheon, New York, 1988).

71 Thorin Klosowski "How Websites Vary Prices Based on Your Information (and What You Can Do About It)" (July 2013) LifeHacker <http://lifehacker.com/5973689/how-web-sites-vary-prices-based-on-your-information-and-what-you-can-do-about-it>. Dynamic pricing encompasses the trend of price variability based on location data.

72 Statistics New Zealand "The Digital Divide" (Wellington, 2013) available at <http://www.stats.govt.nz/browse_for_stats/industry_sectors/information_technology_and_communications/digital-divide/introduction.aspx>; Joy Liddicoat Association for Progressive Communications New Zealand Digital Freedoms Report (Wellington, 2014) available at <https://www.apc.org/en/irhr/i-freedom-nz/about>.

"Big data is coming, like it or not. We have an opportunity to shape it, to ensure it operates for us, not on us. The coming debate whether and how we might do this promises to be a vigorous one."74

III. New Zealand's Data Protection Architecture

"Code changes quickly, user adoption more slowly, legal contracting and judicial adaptation to new technologies slower yet, and regulation through legislation slowest of all."75


This part outlines New Zealand's legal position on data protection: the statutory basis for protecting personal information, the structure of our privacy architecture and the international influences at play. It then explains the technical inconsistencies in how the Privacy Act recognises personal information, focusing on the legal loophole created by the outmoded rationale that de-identification techniques can ensure non-identifiability. This part will also expose the new category of personally predictable information, and will explore the issues regarding Principle 3 of the Act. It will begin to consider which legal tools may be required to tackle the divide between technological advancements and privacy safeguards.
(i) The Privacy Act: An 'Adequate' Instrument?

The New Zealand position on data protection has its origins in the Universal Declaration of Human Rights 1948 and the International Covenant on Civil and Political Rights 1966, which both acknowledge the right to privacy as a fundamental human right.76 New Zealand's current data protection law has been strongly influenced by the 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, which set out eight core principles to protect data.77 These principles are reflected in the New Zealand Privacy Act 1993.78 Additional data protection rights are contained in the New Zealand Bill of Rights Act 1990, which affirms rights against unreasonable search and seizure and to liberty of the person.79

73 Barbara Daskala and Ioannis Maghiros Digital Territories: Towards the Protection of Public and Private Space in a Digital and Ambient Intelligence Environment (Institute for Prospective Technological Studies, Seville, 2007) at 11.

74 Ohm, above n 41, at 346.

75 Ian Brown Regulating Code (MIT Press, Cambridge, 2013) at xv.

The influence of the OECD Principles, and New Zealand's commitment to them, is evidenced in the 2010 amendment of the Privacy Act.80 These Guidelines, rooted in strong rights-based ideals, reflect New Zealand's commitment to advancing human rights and the free flow of information and ideas.81 The Privacy Act avoids taking a prescriptive approach and instead lays out twelve principles that apply to both the public and private sectors when they hold "personal information" about a natural person.82 A positive feature of the Act is the latitude in application of the principles to suit the circumstances of a wide variety of different agencies.83 The wider spectrum includes both persons and companies, yet excludes various branches of the Executive (such as Ministers) and the news media.84 This flexible approach also helps with adaptation to new technologies and shifts in privacy expectations.

76 Interview with John Steadman, Legal Counsel at Spark (Mahoney Turnbull, 8 July 2014).

77 OECD OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (Geneva, 1980) available at <http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm>.

78 New Zealand Privacy Act 1993.

79 R v Jefferies [1993] NZCA 401; [1994] 1 NZLR 290 (CA).

80 New Zealand Privacy Act 1993, Annex 5A; Michael Kirby "Legal Aspects of Transborder Data Flows" (1991) 11(3) Computer L J 233 at 234.

81 Lee Bygrave Data Protection Law: Approaching its Rationale, Logic and Limits (Kluwer Law International, The Hague, 2002) at 113.

82 Interview with John Steadman, Legal Counsel at Spark (Mahoney Turnbull, 8 July 2014).

83 New Zealand Law Commission Questions and Answers to the Law Commission Review 2011 (Wellington, August 2011) at 1.

From the Privacy Commissioner's perspective, the Act sits as a leading "jurisdictional benchmark" in its ability to manage the different values and interests in a data-driven future.85 Thus, it offers a competitive advantage and an excellent platform from which to strengthen and modernise in the age of analytics.

Such international repute prompted the European Union to recognise New Zealand's Act as offering an 'adequate' standard of data protection for the purposes of European law. This recognition reflects Europe and New Zealand's common commitment to upholding human rights, and is a claim only a handful of other countries can assert.86 The ability of European businesses to transfer data to New Zealand without requiring special contractual provisions is an important commercial consideration for New Zealand companies wanting to offer data processing services on a global scale.87 It is important to note that although there are no specific provisions protecting data transferred to third countries, s 10 provides for situations where data is collected from New Zealand and a New Zealand agency transfers information offshore.88 In this instance, the New Zealand-based disclosing agency will remain liable for any subsequent breaches. The EU Working Party's report alerted the Privacy Commissioner (PC) to the need to maintain oversight of transfers to countries that do not have 'adequacy' status.89 It is in the interests of New Zealand companies and policymakers to minimise risks of harm or loss by establishing strong data management frameworks. The value of New Zealand's alignment with the OECD Guidelines reinforces the need for both New Zealand and the EU to be acutely aware of advances in big data, to ensure the technical realities translate into privacy protection.90

84 At 4.

85 John Edwards "New Zealand's Data Future: A View from the Privacy Commissioner" (Wellington, 4 July 2014) at 1; see Bruce Baer Arnold "Ending the OAIC and new frameworks for privacy law" (2014) 11(5) Privacy Law Bulletin 66 at 66.

86 European Commission Directorate-General for Justice Opinion 11/2011 on the level of protection of personal data in New Zealand (Brussels, 2011) available at <http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2011/wp182_en.pdf#h2-13> at 5; New Zealand Office of the Privacy Commissioner "NZ Data Protection Law gets tick from EU Committee" (13 April 2011) <http://privacy.org.nz/news-and-publications/statements-media-releases/nz-data-protection-law-gets-tick-from-eu-committee/>.

87 EU Data Protection Law "EU Data Protection Regulation Timeline" (13 May 2014) <www.eudataprotectionlaw.com>. The need for New Zealand and EU alignment highlights the need to take note of upcoming changes to the EU's Data Protection Directive, which will reach final agreement in 2015. The next phase will be the Council of Ministers meeting to revise the text in October 2014. It will again be analysed at Forum Europe's 5th Annual Data Protection Conference on 9 December 2014; Hunton & Williams "Privacy Law Update" (podcast, 16 September 2014) <www.hunton.com/media/20140916_privacy/20140916_privacyupdate2_Mono2.mp3>.

88 New Zealand Privacy Act 1993, ss 10 and 3(4).

89 Cabinet Social Policy Committee "Government Response to Law Commission Report: Review of the Privacy Act" (12 March 2012) SOC Min (12) 3/1 at 2. The issue of international interactions also prompted the Law Commission to recommend a new obligation to ensure overseas recipients are able and willing to observe acceptable privacy standards.

90 The need for New Zealand to stay in line with the EU Data Protection Directive will help ensure a new approach does not lead to trading opportunities for "New Zealand Inc" being jeopardised. An entirely new approach to New Zealand's Privacy Act would only create medium- to long-term uncertainty.

(ii) Two 'Key Features of the Act' from the Commissioner's Standpoint

In its recent submission to the NZDFF, the PC asserted two key features of New Zealand's privacy law that render it an effective model to address some of the big data challenges.91

91 Cabinet Social Policy Committee, above n 89, at 4.



The first is the breadth of the definition of ‘personal information’, which allows the Act to encompass de-identified and pseudonymous information.92

The most recent Law Commission Report explained that the definition of personal information only requires that the individual be 'identifiable', as opposed to 'identified'.93 A test akin to the United Kingdom's 'reasonableness' criterion for identifiability was proposed, whereby identification must be "reasonably practicable" and not simply theoretically possible.94 To adequately tackle this issue, the Commission considered it most fitting for the PC to release guidance material.95

The second aspect entails the broad exceptions to the principles on collection, use and disclosure, where information will be used in a form in which individuals will not be identified.96 This means that if agencies have a lawful purpose for collecting personal information and do not intend to use it in a form in which individuals will be identifiable, then they are free to do so without having to obtain consents that apply to all future uses.97

The PC's confidence in the available exceptions98 to cater for beneficial re-use of data echoes the Law Commission's conclusion. However, the Commission failed to acknowledge the re-use issues related to aggregated data sets stemming from groups as opposed to single individuals.99 The consensus was that if the uses of aggregated information as described by Gunasekara were a problem, they could be dealt with in other ways, such as through consumer legislation.100

The use that Gunasekara was referring to relates to the predictive profiling that may have discriminatory or otherwise adverse effects on individuals. This foresight was a valuable addition to the review, yet was largely sidelined for fear of casting the net too wide. While the issue of widespread aggregation may not have been so acute in 2011, the concern is much more real now. The technological advances since Gunasekara's comments pose greater privacy risks and should be at the forefront of strategic planning for data regulation.

The Act affords the use and disclosure exceptions; coupled with the technological capacity for re-identification, there cannot be a "reasonable belief" that the information will be used in a form in which "the individual concerned is not identified".101 In light of such latitude, it is understandable why disparaging comments have been directed at the statute, and it suggests the law may be heading for a blunt head-on collision with big data.102

Even if the Act were to acknowledge that identifiable data should be recognised in the provisions regarding disclosure limits, another issue remains. This is due to a new subset of personal information which has emerged. Its emergence reinforces the Microsoft Privacy Summit's conclusion that "to limit personal data to what is recognized as 'personal' is too narrow".103 The result is a class of 'personally predictable information' that does not even hinge on being personally identifiable, let alone identified.104 Recognising the inherent tensions in this nuanced category of personal information is critical to appreciating how big data can circumvent current notions of privacy law.

92 John Edwards, above n 85, at 3.

93 New Zealand Law Commission Review of the Privacy Act 1993: Review of the Law of Privacy Stage 4 (Issues Paper, 2010) at 3.20.

94 At 2.53. Other jurisdictions such as the UK have required the Information Commissioner to release guidance elaborating on the EU Data Protection Directive, specifying that there must be more than a "hypothetical possibility" of identifiability.

95 Law Commission, above n 93, at 3.20; Cabinet Social Policy Committee, above n 89, at Attachment 1.

96 John Edwards, above n 85, at 3.

97 New Zealand Privacy Act 1993, s 6, Principle 3.

98 Section 6, Principles 10(f)(i) and (ii), and 11 allow an agency to use the information as it wants, provided it is used in a form in which the individual concerned is not identified, or is used for statistical or research purposes and will not be published in a form that could reasonably be expected to identify the individual concerned.

99 New Zealand Law Commission, above n 93, at 2.50.

100 At 2.50. Auckland University academic Gehan Gunasekara's submission was expressly mentioned. His point was that the de-identification techniques, which are prompting information to be aggregated so that it no longer relates to identifiable individuals, still enable classification into groups.

101 New Zealand Privacy Act, s 6, Principles 10(f)(i) and 11(h)(i).

102 Interview with Paul Roth, University of Otago Law Professor (Mahoney Turnbull, 7 August 2014).

The Law Commission did pinpoint identifiability as a challenge, noting it would "become more acute over time". Yet the decision not to engage proactively in reframing the nature of identifiable material indicates a lack of foresight regarding the relevance of this class of data.105 Accordingly, companies enjoy an unbridled ability to leverage the exception and justify the disclosure of effectively personal information. This implies that New Zealand's adequacy status is dubious in the context of big data. In assessing its suitability, let us now turn to explore some of the technical inconsistencies.









103 Fred Cate and Viktor Mayer-Schönberger Notice and Consent in a World of Big Data: Global Privacy Summit Report and Outcomes (Washington, 2012) at 10; Bernard Stiegler “Die Aufklärung in the Age of Philosophical Engineering” in Mireille Hildebrandt, Kieron O’Hara and Michael Waidner (eds) Digital Enlightenment Yearbook (IOS Press, Amsterdam, 2013) at 31.

104 Andy Green "Personally Identifiable Information Hides in Dark Data" (13 April 2013) Varonis <http://blog.varonis.com/personally-identifiable-information-hides-in-dark-data/>.

105 NZ Law Commission, above n 93, at 54.


(iii) The De-Identification Myth

Notwithstanding the lack of guidance on the process of anonymisation,106 the technical inconsistencies that form potent threats to privacy hinge on two factors:
  1. The concept of de-identification has become increasingly outdated.107 Not only is de-identification now recognised as an illusory guard against privacy breaches, it is also subject to a re-identification arms race.108
  2. We now have a new genre of ‘personally identifiable information’ which routes around the element of identifiability.109

106 New Zealand has refrained from incorporating into the Act any specific indications on de-identification protocols or how the de-linking of personal identifiers is meant to occur. The Australian Privacy Act, with its recent changes in this sphere, now references the technique of de-identification, which is bolstered by numerous anonymisation guidelines and resources released by the National Statistical Service.

107 Ann Cavoukian and Daniel Castro Big Data and Innovation, Setting the Record Straight: De-identification Does Work (Ontario, June 2014) available at <http://www2.itif.org/2014-big-data-deidentification.pdf> at 3. This report defines de-identification as the process of removing or modifying both direct identifiers and indirect or quasi-identifiers, unlike 'masking', which only involves the removal or modification of direct identifiers. See also Spark's submission to the NZDFF ("data which has been treated to decrease the ability to be linked back to identify individuals").

108 Paul Ohm "Broken Promises of Privacy: Responding to the Surprising Failure of Anonymisation" (2010) 57 UCLA L Rev 1701 at 1752; Paul Schwartz and Daniel Solove "The PII Problem: Privacy and a New Concept of Personally Identifiable Information" (2011) 86 NYU L Rev 1814 at 1879–1883. This race is gaining traction with computational innovation, which exposes individuals to "the database of ruin". The crossing of identity boundaries is not a new phenomenon, but the ability to easily do so in the digital era is a significant innovation and represents a normative shift in social expectations of privacy.

109 Andy Green, above n 104.



De-identification and its opposing force, re-identification, are disrupting the privacy landscape.110 It is now well accepted that de-identified data sets can still be attributed to specific individuals, which casts doubt on the fundamental distinction between personal and non-personal data.111 At the same time, re-identification has heightened the harms associated with invasive aggregation methodologies by allowing data controllers to link more information to an individual's profile.112

Whilst pro-market think tanks may be producing evidence to prove that the risks of re-identification are grossly exaggerated,113 the vast majority of computer scientists consistently rebut this claim.114 Fresh evidence from Princeton scientists shows that attempts to quantify the efficacy of de-identification are unscientific and promote a false sense of security by assuming "artificially constrained models of what an adversary might do".115
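The re-identification these researchers describe typically works by linking quasi-identifiers across datasets, as in Sweeney's pioneering work cited below. A minimal sketch of such a linkage attack, using synthetic records and hypothetical field names:

```python
# Minimal sketch of a linkage (re-identification) attack. All records
# are synthetic; field names are hypothetical.

# A "de-identified" health dataset: names removed, but quasi-identifiers
# (postcode, birthdate, sex) retained.
health_records = [
    {"postcode": "9016", "birthdate": "1987-03-14", "sex": "F", "diagnosis": "asthma"},
    {"postcode": "6011", "birthdate": "1962-11-02", "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset (e.g. an electoral roll) containing the same
# quasi-identifiers alongside names.
electoral_roll = [
    {"name": "Jane Doe", "postcode": "9016", "birthdate": "1987-03-14", "sex": "F"},
    {"name": "John Roe", "postcode": "6011", "birthdate": "1962-11-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birthdate", "sex")

def link(anonymised, public):
    """Re-identify records by joining the two datasets on quasi-identifiers."""
    index = {tuple(p[q] for q in QUASI_IDENTIFIERS): p["name"] for p in public}
    for record in anonymised:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        if key in index:  # a unique match re-identifies the record
            yield index[key], record["diagnosis"]

for name, diagnosis in link(health_records, electoral_roll):
    print(f"{name} -> {diagnosis}")
```

The attack needs no names in the 'anonymised' set at all; it only needs the combination of retained attributes to be unique, which is exactly what Sweeney showed it usually is.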



110 Paul Ohm, above n 108, at 1704.

111 The Sweeney Test, highlighted by Ohm, refers to the research pioneered by Latanya Sweeney and made accessible by Ohm. The results of the test marked a turning point in debunking de-identification: she was able to show in 2000 that 87 per cent of all Americans could be uniquely identified using only three pieces of information: post code, birthdate and sex.

112 Daniel Solove "A Taxonomy of Privacy" (2006) 154 U Pa L Rev 477 at 511. Big data makes aggregation of datasets more granular, more revealing and more invasive.

113 Ann Cavoukian and Daniel Castro Big Data and Innovation, Setting the Record Straight: De-identification Does Work (Information and Privacy Commissioner, Ontario, 2014) <http://www2.itif.org/2014-big-data-deidentification.pdf> at 2.

114 Arvind Narayanan and Edward Felten "No Silver Bullet: De-Identification Still Doesn't Work" (unpublished manuscript, Princeton University, 2014) at 1. Relying on protocols like anonymisation, pseudonymisation, encryption, key-sharing, data-sharing and noise addition is insufficient.

115 At 5. The 'penetrate-and-patch' method that has been recommended, in which systems are fielded with live data, broken through challenges and then revised, has been largely ineffective both in traditional information security development and in de-identification efforts.



It appears that de-identification, traditionally viewed as a silver bullet, has been debunked.116 De-identified material is not a stable category, but rather a transition point to ultimate re-identification, a point which is becoming easier to reach.117 This relates to the second technical inconsistency that is threatening the data dynamic: the personally predictable nature of data.
(iv) Personally Predictable Genre

The Commission’s 2011 Report recognised that an absolute ability to be ‘identified’ was no longer a reasonable standard to aspire to; the emphasis should be on being identifiable. This insight was supported by the PC recognising that over time, more information would start falling within the definition of personally identifiable information.118

Google and other similar companies made submissions opposing an expansive interpretation of personal information to the extent of identifiability. They argued that an unduly wide definition would subject service providers to "potentially unnecessary regulation regarding collection, notification and use of disaggregated and uncombined pieces of information". These 'pieces of information' serve as essential data points that determine the ability of companies like Google to provide 'freemium'119 services. They would not, Google argued, necessarily be "intended to identify a particular individual".120 While reflecting the commercial realities of the data industry's business model, this resistance indicates a more nuanced understanding of re-identification forecasts. The Commission should have been more cognisant of this. Online service providers like Google foresaw the growing trend of inventive algorithms and the strategic significance of being able to engage in the 'necessary' relinking of "uncombined" data sets in ways that would not specifically subject them to privacy legal frameworks.

116 Ira Rubenstein, Ronald Lee and Paul Schwartz "Data Mining and Internet Profiling: Emerging Regulatory and Technological Approaches" (2008) U Chicago L Rev 261 at 268–269.

117 Colin Bennett and Christopher Parsons "Privacy and Surveillance" in William Dutton (ed) The Oxford Handbook of Internet Studies (Oxford University Press, Oxford, 2013) at 499. The ability to re-identify demonstrates the dangers of releasing granular information about search terms.

118 New Zealand Law Commission Report, above n 93, at 2.48. Reference was also made to the International Institute of Communications' Report on Personal Data Management, which concluded that a simplistic, binary and static data-management policy that dictates a priori whether data is considered personal is insufficiently flexible for the rapidly evolving digital world.

119 Freemium, above n 29.

The ease with which ‘recalibration’ occurs has shifted. The reality is that personally identifiable information is in a state of flux.121 By using the terminology of ‘not personally identifiable’, the Act makes no distinction between data entered into standardised fields and information entered as free text.122 The development of the ‘semantic web’123 reflects the increasing flexibility with which data sets are interpreted to derive granular strains of value. This demonstrates that technologists are increasingly adept at interpreting free unstructured text and linking it back to a person.
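As a crude illustration of the free-text point, consider the following sketch. The clinical note and the patterns are hypothetical; production systems use statistical named-entity recognition and linked knowledge bases rather than regular expressions, but the principle is the same.

```python
import re

# Even with formal identifier fields stripped, free text can carry
# quasi-identifiers that link a record back to a person.
note = "Pt seen at Dunedin clinic, dob 14/03/1987, works as harbourmaster, postcode 9016."

patterns = {
    "date_of_birth": r"\b\d{2}/\d{2}/\d{4}\b",
    "postcode": r"\bpostcode\s+\d{4}\b",
    "rare_occupation": r"\bharbourmaster\b",  # rare attributes shrink the match set
}

extracted = {label: m.group(0) for label, rx in patterns.items()
             if (m := re.search(rx, note))}
print(extracted)
# Combining a birthdate, a postcode and a rare occupation can single out
# one individual, even though no name appears anywhere in the text.
```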

The issue is that in the digital domain of ‘dark data’, the invention of new algorithms will not stop anytime soon. If Acquisti’s work on augmented reality and facial recognition is anything to go by, we are still in for some major upheavals. This trend is likely to see us progressing

120 New Zealand Law Commission Report, above n 93, at 2.48.

121 Ohm, above n 110, at 1704. Ohm has been bold enough to disregard the concept of ‘personally identifiable information’ completely. He advocates instead embracing the ever-expanding category, and focusing on the risks of harm in specific contexts, weighed against the benefits of the free flow of information in those contexts.

122 Green, above n 109; UK Anonymisation Network <www.ukanon.net>. The term ‘identifiers’ is often misunderstood to simply mean ‘formal identifiers’ such as the data subject’s name, address etc. But identifiers could in principle include any piece of information, or combination of information, that makes an individual unique in a dataset and as such vulnerable to re-identification.

123 Green, above n 109. The ‘semantic web’ focuses on looking to the meaning of the data as a whole, rather than particular letters or numbers.



towards increasing fusion of offline and online.124 On this basis, there can be no faith in the current definition of ‘personal information’ being able to cater for what is actually occurring in the big data domain.
E. Repurposing in the Dark: Principle 3 and Unknown Purposes

The question of whether big data increases or changes the risk to privacy is a critical one. The fact that companies do not know in advance what they may discover creates a tension in applying the Act to current big data patterns. The legitimacy of collecting data for its own sake, as opposed to for a specific future purpose, is a grey area in the current framework. By its very nature, big data entails collecting personal information with a blank purpose. This fundamentally cuts against the Act’s first privacy principle, which requires the purpose of collection to be connected to the core function of the company.125 Furthermore, the inherent ‘unknowns’ of big data make it difficult for companies to genuinely comply with the requirement of Principle 3 and inform the individual concerned of the purpose for which information is being collected.126 Since the majority of innovative secondary uses have not been imagined when the data is first collected, the question arises as to how individuals can give consent to an unknown scenario.127

Principle 3, which requires the agency to make known to the individual the future purpose of their data collection,128 is no longer fit for

124 Acquisti, above n 32.

125 Paul Roth and John Edwards “Structure and Overview of the Privacy Act” in Privacy Law: Where are we now? (New Zealand Law Society, May 2013) at 3. NZPA Principle 1(a) requires that information must be collected for a purpose connected with a function or activity of an agency. This prima facie excludes an unrelated linking of that data to a novel purpose which may still have beneficial outcomes.

126 New Zealand Privacy Act 1993, s 6, Principle 3.

127 This issue will be expanded upon in the following Part IV (ii)(a) analysis.

128 New Zealand Privacy Act, Principle 3(1)(a)–(b). Where an agency collects personal information directly from the individual concerned, the agency shall



purpose. This undermines the central role assigned to data subjects under the current privacy framework. It also threatens the spirit of informational self-determination, which the German Federal Constitutional Court recognised can be crucial to the growth of society as a whole.129 According to the Act’s principles of purpose,130 collection,131 and reuse,132 individuals have an opportunity to agree to lawful data collection. It is this unease over user acquiescence to unknown future use and potential data exploitation that prompts closer analysis of the interface which should govern data-sharing initiatives. This pressing issue, which cuts across fundamental contractual, privacy and informational self-determination rights, will be further explored in the final part of this article.
F. Towards More Progressive Legal Tools

Having preceded the advent of personally predictable information, the Privacy Act is now showing its age.133 Whilst we know the benefits of data-sharing are undoubtedly significant, the legal loopholes enabling companies to capitalise on the expansive nature of the exceptions seem unjustified. Instead of playing catch-up to emerging technological capabilities, New Zealand’s toolkit ought to demonstrate a more progressive approach, and lead the charge in coherent data governance.

take such steps (if any) as are, in the circumstances, reasonable to ensure that the individual concerned is aware of the fact the information is being collected and the purpose for which the information is being collected.

129 Mayer-Schönberger and Cukier, above n 6, at 154; Paul de Hert “Identity Management of e-ID, privacy and security in Europe. A Human Rights view” (2008) 13(2) Informational Security Technical Report 71 at 72.

130 New Zealand Privacy Act, s 6, Principle 1.

131 Principle 3.

132 Principle 11.

133 Christopher Kuner “The Challenge of ‘Big Data’ for Data Protection” (2012) 2 International Data Privacy Law 47 at 47–48.



Whilst it is important to examine the changing scope of information viewed as personally identifiable, and the repurposing inconsistency, from a technical standpoint, this can mask a fundamentally normative question: whether, and how, the data ought to be used.

It is therefore encouraging to see this issue coming to the forefront of the legislature’s attention. Since the Law Commission Report’s release in 2011, statements from the Minister of Justice have signalled the “need to develop new ways to achieve trust and privacy”.134 Emphasis has been placed on upcoming reforms, ensuring that the law better reflects the digital age, whilst bringing New Zealand into alignment with its major trading partners.135 The expectation is that these proposals will put stronger incentives in place to ensure the private sector takes data protection seriously.

Effective legislative and regulatory action could place New Zealand at the forefront of big data stewardship and signal the country’s capacity to drive data-led innovation in a principled, privacy-enhancing way. The following parts of this article will further develop this issue, and suggest measures New Zealand could take in this direction.
IV. The Regulatory Remedy: A Data Standards Authority

This section clarifies the justification for a new body to regulate the wider uses of data and the standards that ought to govern data management. It will offer an example of an existing Standards

134 New Zealand Parliament “Judith Collins Press Statement Privacy Act Changes” (Wellington, 28 May 2014) available at

<http://www.beehive.govt.nz/release/privacy-law-changes-strengthen-protection>; Cabinet for Social and Policy Committee, above n 89, at 30.3. Taking an entirely new approach would take New Zealand out of line with major trading partners in the OECD.

135 At 10. The Privacy Commissioner made the comment that the Act must remain internationally acceptable and continue to support innovation and responsible modern business.



Authority and relevant elements that the Data Standards Authority (DSA) could draw upon.136 It describes the anticipated interface with the Privacy Act, and how an amendment to the Act could enable this body to come to fruition. It then looks at the structure and composition of the DSA, whilst also exploring its advisory role, in particular the oversight of industry-specific codes of practice. The section then turns to possible response mechanisms the DSA could exercise, ranging from infringement notices to pecuniary penalties. It touches on the possibility of overlaying these measures with publicity, and the prospect of compensatory and exemplary damages.
A. The Call for a New Standards Body

“The time may have come to set up an independent body specifically focused on maximizing the benefits to New Zealand from data”.137


We should not take the PC’s call for the establishment of a new body lightly. It is a powerful signal that the privacy scene has shifted, and the legislative instrument to tackle these changes needs a rethink. The Commissioner’s recognition that his mandate fails to encompass ‘wider uses of data’ is a pertinent reminder of the danger in neglecting to account for the extending reach of personally identifiable material. Any policy response to this omission must acknowledge that the internet is a domain enmeshed in emerging forms of governance, which are still amorphous. What is needed to help cure the disjunction between the rapidly expanding data network and the laws that govern it is immediate clarification on personal data benchmarks. There is little doubt that the digital ecosystem could benefit from a clarified framework of standards to help guide New Zealand data holders and users towards greater data responsiveness.

136 In contrast to the NZDFF’s proposal of a Data Council, this article will use the terminology of a Standards Authority, to emphasize the standard-based regulatory powers which this body would possess.

137 John Edwards, above n 85, at 3.



The best approach is to provide a transition point towards an international charter for Data Protection and Privacy standards.138 This would take the form of a New Zealand-centric data standards framework, which would ensure that individuals remain protected, data processors embrace their responsibilities, and innovation is not artificially constrained.139 Leveraging New Zealand’s existing architecture and building a framework around this in an efficient regulatory manner would be the most sustainable way to future-proof against big data challenges.140 Although there is a case for delaying major proposed changes to the Privacy Act until the upcoming amendments of the EU Data Protection regime are made official,141 this factor would not have to impact on regulatory measures that are classified as disallowable instruments that are not legislative instruments (DINLI).142

The European Commission’s recent announcement of ‘Horizon 2020’ and its focus on developing common standards to facilitate the data-

138 International Conference of Privacy Commissioners Madrid Resolution: Joint Proposal for a Draft of International Standards on the Protection of Privacy with regard to the processing of Personal Data (Madrid, November 2009) at 29. New Zealand was one of ten countries who proposed the Resolution for International Standards.

139 Fred Cate and Viktor Mayer-Schönberger, above n 103, at 15.

140 In the same way that the Privacy Commissioner can create subordinate legislation, or Disallowable Instruments that are not legislative instruments (DINLI) through the code creation powers in Part 6 of the New Zealand Privacy Act, this body would have similar powers to create DINLI that pertain to the data standards.

141 Assuming New Zealand wants the best chance at maintaining its ‘adequacy’ status, and certainty as to the coming European standards, New Zealand commentary has indicated that major changes to the Privacy Act should wait until the EU amendments are implemented.

142 Regulations Review Committee “Inquiry into the oversight of disallowable instruments that are not legislative instruments” (July 2014) I.16H

<http://www.parliament.nz/resource/en-nz/50DBSCH_SCR56729_1/2dd6b5922847c918b02457adfb7e83f055a20f35> at 6. Unlike legislative instruments, these instruments as defined by s 38(1)(b) of the Legislation Act 2012 provide greater scope for change and industry-specific tailoring.



driven economy indicates the increasing lean towards this regulatory strategy.143 The European Commissioner’s plan to identify sufficiently homogenous sectors suggests New Zealand should take a similar route in creating a body to provide customised data protection. This would foster a stronger security culture, and help detect and respond to data mismanagement across sectors.144

The establishment of the New Zealand Data Futures Forum (NZDFF) earlier this year demonstrates exactly the sort of thoughtful discussion of data stewardship that is necessary. Moreover, it highlights the call from the business community for more certainty to enable data experimentation within well-understood and navigable boundaries.145 Innovation ironically requires certainty.146 It is clear that the requisite innovation has already begun. We now need to regulate the exchanges of data in a meaningful way and it seems best to begin this process with standard-based architecture.

The most suitable enabling Act for establishing the DSA would be the Privacy Act.147 This would entail inserting an amendment into the Act, in accordance with the Crown Entities Act, echoing the amendment to the Broadcasting Act that established the BSA in 2005.148 It would be appropriate for this amending provision to outline the key principles of data stewardship upon which the standards contained in the codes would be based. It would not present a radical departure from the

143 European Commission, above n 4, at 9.

144 At 11.

145 New Zealand Data Futures Forum (NZDFF) Second Discussion Paper (New Zealand, 2014) available at

<https://www.nzdatafutures.org.nz/sites/default/files/first-discussion-paper_0.pdf> at 15.

146 At 15.

147 New Zealand Privacy Act; Interview with John Edwards, Privacy Commissioner (Mahoney Turnbull, July 2014).

148 New Zealand Broadcasting Act 1989, s 20; New Zealand Crown Entities Act 2004, ss 7 and 200.



current system but rather a reboot of the Information Privacy Principles in alignment with data protection developments that require more nuanced principles. This interface with the Privacy Act would enable the confluence of personal data issues with the structure of an established system designed to endure changes in our digital landscape.
B. Structure of the DSA
1. The Data Council

In line with the NZDFF’s proposal of an independent data council to serve as ‘guardians’ of the data ecosystem, the DSA could encompass this form of strategic leadership from a mix of stakeholders.149

In terms of composition, the council could take its cue from recent developments in the domestic policy sphere. The inclusion of a ‘Chief Technology Officer’150 (CTO) would be a valuable addition as a neutral data arbiter.151 The endorsement of a CTO in the NZDFF’s discussion paper, bolstered by support from academics152 and industry leaders, reinforces the value in creating this position to help identify and tackle emerging issues whilst encouraging a secure data environment.153 In contrast to the traditional framework, which tends to leave businesses

149 New Zealand Data Futures Forum (NZDFF) Third Discussion Paper: Harnessing the economic and social power of data (New Zealand, 2014) available at

<https://www.nzdatafutures.org.nz/sites/default/files/NZDFF_harness-the-power.pdf> at 16.

150 This concept was first incorporated in the Green Party’s proposed Internet Rights and Freedoms Bill to supplement the role of the Privacy Commissioner and advise Parliament and Cabinet on the challenges and risks for New Zealand’s digital ecosystem.

151 Internet Rights and Freedoms Bill available at

<https://home.greens.org.nz/misc-documents/internet-rights-and-freedoms- bill>. This role has been likened to the Chief Science Advisor who is responsible for advising the Prime Minister on scientific matters.

152 Interview with Hon Michael Kirby (Mahoney Turnbull, 5 August 2014).

153 New Zealand Data Futures Forum, above n 149, at 48.



working reactively on legislative action, this would enable a proactive engagement model to grow between the public and private sectors.
2. Code creation

Building upon the core principles outlined in the amending provision of the Privacy Act, the purpose of industry-specific codes would be to define best practice around data management. This would build upon the protocol already established in the Privacy Act for relevant industries to issue codes themselves.154 Just as the Privacy Act already allows codes to be less or more stringent than the Information Privacy Principles,155 the DSA codes could provide standards that are tailored to the particular requirements of different sectors operating in the economic and social fabric of New Zealand. Case studies of industry-led co-regulatory pursuits have consistently proven that the collaborative approach can be administratively efficient.156 Combined with the lean towards expanded regulatory mechanisms for resolving market and enterprise related issues, New Zealand is well placed to draw upon these experiences and pursue a code-driven framework.157

In pursuing this consensus-based regulatory method, caution is needed to avoid soft data-sharing rules arising from self-interested industry input.158 The DSA would need to be alert to dubious regulatory commitments from commercial players that show more “public relations” impetus than genuine precaution. To guard against this outcome, the inclusion of privacy and consumer advocacy groups could be an essential component of the code-creation process. Against this threat

154 New Zealand Privacy Act, s 47(3).

155 Section 46(2)(a)(i).

156 Dennis Hirsch Dutch Treat? Collaborative Dutch Privacy Regulation and Lessons it holds for US Privacy Law (Future of Privacy Forum, July 2012) at 44–45.

157 At 44.

158 At 81.



however, a “game-changing” opportunity159 exists for New Zealand to use ‘co-regulatory’ muscle in creating the new regulatory system.160
C. Principles to Guide the Data Standards Authority

This section outlines the guiding principles that would enable best data practice to develop. The principles would be reflected in the relevant industry codes, and incorporated into the amending provision of the Privacy Act. This section will propose two central principles that are fundamental to effective data stewardship. After highlighting the tension in strategic maximisation of data, it will assess the principle of prioritising Privacy by Design (PbyD),161 with a focus on a de-identification protocol, as well as consumer-friendly privacy settings. This will address concerns of data empowerment, and the underlying problems surrounding consent in the realm of big data. Whilst recognising the potential emptiness of the notice and consent construct, it will explore ways that could help formulate more effective rules of engagement.




159 NZDFF, above n 153, at 48.

160 The White House Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (Washington, February 2012) at 32.

161 ‘PbyD’ refers to Privacy by Design. See Ann Cavoukian “Personal Data Ecosystem: A Privacy by Design Approach to an Individual’s Pursuit of Radical Control” in Mireille Hildebrandt, Kieron O’Hara and Michael Waidner (eds) Digital Enlightenment Yearbook (IOS Press, Amsterdam, 2013) at 96. The objectives of Ontario Privacy Commissioner Ann Cavoukian’s Privacy by Design method, which she developed in the 1990s to address privacy needs, are to ensure privacy and personal control whilst allowing organisations to gain a competitive advantage by following the seven foundational principles. ‘Radical control’ refers to individuals having the tools to predict the outcomes of their actions when interacting with organisations.


1. The Overarching Aim of Strategic Maximisation of Data

Simply stated, data minimisation is at odds with the essence of big data.162 An inherent conflict exists between the non-retention impulse mandated by the Privacy Act,163 and the maximisation of data that the big data business model demands. Whilst this article cannot delve further into the complexities and possible solutions for reconciling the minimisation versus maximisation struggle, it is important to recognise the tension. Knowing this pressure exists, the question is how to refine and repurpose data in the most strategic way.164

Given the potential ‘pollution’ of stale data, there is a need for structural incentives to streamline data sets. On this basis, stimulating the market for privacy enhancing services that prompt greater engagement in judicious data maximisation should be a core focus in the regulatory solution.
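One possible shape such a structural incentive could take is sketched below: a simple retention rule that streamlines data sets by expiring ‘stale’ records. The field names and the two-year horizon are illustrative assumptions, not a proposed standard.

```python
# A sketch of a retention rule that expires 'stale' records.
# Field names and the two-year horizon are assumptions for illustration.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=730)  # assumed two-year retention window

def prune_stale(records, now):
    """Keep only records whose last use falls inside the retention window."""
    return [r for r in records if now - r["last_used"] <= RETENTION]

records = [
    {"id": 1, "last_used": datetime(2012, 6, 5, tzinfo=timezone.utc)},   # stale
    {"id": 2, "last_used": datetime(2014, 11, 20, tzinfo=timezone.utc)}, # fresh
]
kept = prune_stale(records, now=datetime(2015, 1, 1, tzinfo=timezone.utc))
print([r["id"] for r in kept])  # -> [2]
```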
2. Principle 1: Prioritising Privacy by Design (PbyD) and a De-Identification Protocol
(a) Nimble analytics and the role of the algorithmist

The “architectures of vulnerability”165 around big data are prompting regulatory swings in the PbyD direction. By focusing the first principle on PbyD and embedding privacy in the design specifications of the data lifecycle, weaknesses can be corrected and organisations motivated to show sound data stewardship.166

162 Rubenstein, above n 6, at 5; Tene and Polonetsky, above n 16, at 260.

163 New Zealand Privacy Act, s 6, Principle 9.

164 Edgar Whitley “Towards Effective, consent-based Control of personal data” in Mireille Hildebrandt, Kieron O’Hara and Michael Waidner (eds) Digital Enlightenment Yearbook (IOS Press, Amsterdam, 2013) at 169.

165 William Dutton The Oxford Handbook of Internet Studies (Oxford University Press, Oxford, 2013) at 19.

166 NZDFF, above n 153, at 61.



Despite the contested futility of de-identification,167 the ‘call to keyboards’ is still being heard on the international stage.168 The algorithmist’s ability to create scalable ‘Privacy Enhancing Technology’ is crucial in formulating effective data standards.169 This does not mean an abdication by policymakers, or the DSA, but a recognition that algorithmists have the potential to make or break data protection protocol. In this way, PbyD moves beyond the normative spheres of law and best practice, directly into emerging technology and the marketplace.
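By way of illustration, the sketch below shows one widely studied privacy-enhancing technique an algorithmist might supply: releasing an aggregate statistic with calibrated Laplace noise, the basic mechanism of differential privacy. It is offered as an assumed example under illustrative parameters, not as a protocol endorsed by the sources discussed here.

```python
# A sketch of the Laplace mechanism: noise scaled to sensitivity / epsilon
# is added before an aggregate is released. Count and epsilon are illustrative.
import random

def noisy_count(true_count, epsilon=0.5, sensitivity=1.0):
    """Release a count with Laplace noise of scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    # The difference of two exponential draws is Laplace-distributed.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

smokers_in_sample = 1347  # hypothetical aggregate over a sensitive dataset
print(round(noisy_count(smokers_in_sample)))
```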

From a market-driven perspective, PbyD will grow a “vibrant marketplace for privacy-enhancing services” and further economic development.170 Indeed, the World Economic Forum (WEF) credits PbyD with unlocking the value of data.171 New Zealand policy makers should demonstrate their support of the WEF’s agenda and incentivise organisations to make privacy a key commercial priority.172

167 Lars Backstrom “Wherefore art thou r3579x? Anonymized social networks, hidden patterns, and structural steganography” (paper presented at the 16th International Conference on the World Wide Web, Canada, 2007) at 181–190.

168 Edith Ramirez, Chair of the US Federal Trade Commission “Data Brokers: A Call for Transparency and Accountability: Opening Remarks” (speech presented to Federal Trade Commission, May 2014).

169 Simone Fischer-Hübner “Online Privacy - Towards Informational Self Determination on the Internet” in Hildebrandt, above n 103, at 137; European Commission, above n 4, at 7. Privacy-enhancing technologies have been defined as a “coherent system of information and communication technology measures that protect privacy without losing the functionality of the information system”; Brill, above n 10. The ‘algorithmist’ is the individual in the company who understands the use of algorithms and their legal and ethical implications.

170 European Commission, above n 4, at 33.

171 World Economic Forum Unlocking the value of Personal Data: From Collection to Usage (Geneva, 2013) available at

<http://www3.weforum.org/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf> at 4.

172 Claudia Diaz, Omer Tene and Seda Gurses “Hero or Villain - the Data Controller in Privacy Enhancing Technologies” (2013) 74 Ohio St L J at 959; Simone Fischer-Hübner, above n 169, at 9.


(b) The DSA’s clarification on de-identification

As recognised by the NZDFF and the various submissions to their study, a gap currently exists in specifying de-identification techniques expected from organisations. Looking at the UK and Australia, it is evident that New Zealand is lagging behind in establishing clear standards for this technological process.

New Zealand lacks an equivalent to the UK Information Commissioner’s Office (ICO) disclosure considerations test, which aligns with the UK Anonymisation Network’s resource for best practices in the anonymisation of data sets.173 The ICO justifies its de-identification protocol on the basis that a complacent approach, alongside an insufficiently rigorous risk analysis, causes inappropriate data disclosures.

Indeed, New Zealand lacks a published anonymisation protocol. Statistics New Zealand does not currently offer companies seeking to follow its approach a usable framework.174 Accordingly, companies such as Telecom175 have called for this methodology to be publicised so it can be reviewed and used by the industry. Not only would this enhance awareness of the desirable standard, it would also enable data holders to plan for dealing with re-identification. The DSA

173 Bendert Zevenbergen Ethical Privacy Guidelines for Mobile Connectivity Measurements (Oxford Internet Institute, 2013) at 10; European Commission Article 29 Data Protection 05/2014 on Anonymisation Techniques (Brussels, April 2014) at 25. The Article 29 Working Party was set up under Article 29 of Directive 95/46/EC. It is an independent European advisory body on data protection and privacy. Its tasks are described in Article 30 of Directive 95/46/EC and Article 15 of Directive 2002/58/EC.

174 Statistics New Zealand <www.stats.govt.nz>. Reference is given to collapsing, aggregating, modifying values and suppressing data cells.

175 Spark, above n 50. At the time of receiving the document from Telecom’s legal team, the company had not yet changed the name to Spark.



framework would benefit from taking the UK and Australian examples into account in order to demystify the de-identification process.
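For illustration, the sketch below applies two of the techniques footnote 174 attributes to Statistics New Zealand, aggregation and cell suppression, to hypothetical unit records. The threshold of five is an assumption; no published New Zealand rule is implied.

```python
# A sketch: unit records are aggregated into table cells, and any cell
# describing fewer than five people is suppressed. All data is hypothetical.
from collections import Counter

raw = ([("Dunedin", "asthma")] * 6 +
       [("Gore", "asthma")] * 2 +
       [("Dunedin", "diabetes")] * 3)  # hypothetical unit records

MIN_CELL = 5  # assumed suppression threshold

table = Counter(raw)  # aggregate into (area, condition) cells
for cell, count in sorted(table.items()):
    shown = count if count >= MIN_CELL else "S"  # 'S' marks a suppressed cell
    print(cell, shown)
```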
3. Core Problems of Data Empowerment: Rules of Engagement
(a) Issue of consent: an empty construct?

In a world of big data, the reality is that we have shifted to a landscape of passive data generation and collection. Although the need for consent is clear, it is impractical, if not impossible, for users to give express consent with respect to all collected data.176 Big data thrives on surprising correlations that call into question laws relying on traditional ideas of notice and consent.177

The backdrop for data-sharing is increasingly complex, as data flows are channelled through dense networks of platforms and applications. Back-handling or ‘downstream’ agreements178 obscure this environment, which is aggravated by the opacity of decisional criteria.

The current model of stating purposes and obtaining data processing consent at the outset highlights an important fault line between law and technology, and the redundancy of the traditional paradigm.179 Indeed, the unknown factors in data repurposing require a workable framework to help alleviate the artificial nature of the consent model.



176 See Part 3 E, ‘Repurposing in the Dark: Principle 3 and unknown purposes’.

177 Paul Ohm “General Principles for Data Use and Analysis” in Lane and others (eds) Privacy, Big Data and the Public Good (Cambridge University Press, Cambridge, 2014) at 100.

178 Tene, above n 16, at 261.

179 At 271; Fred Cate and Viktor Mayer-Schönberger, above n 103, at 14; Helen Nissenbaum and Solon Barocas “Big Data’s End Run around Anonymity and Consent” in Lane and others, above n 177, at 60.


(b) Information asymmetries + poor understanding = lack of engagement

The emptiness of the construct not only fails to create an effective contractual relationship between the parties, but also establishes an unacceptable power imbalance. This form of ‘engineered’ consent, where an illusion of free choice is proffered, plays into the hands of cognitive biases, which produce suboptimal results.180 Behavioural studies have demonstrated the skewed nature of the subjective utility upon which data subjects’ decisions are based.181 This is cogently illustrated by a recent performance art experiment in which individuals gave away highly granular personal data in return for a biscuit.182 The immediate experiential gains from ‘free services’, in contrast to the temporal distance of privacy losses, cast a shadow on the authenticity of privacy choices.183

Informational asymmetry is a critical issue, and when linked with intelligibility obstacles, creates an inadequately engaged data subject. Genuine informed consent has been rendered essentially impossible due to complicated fine print, which deters users and creates social pressure not to appear awkward or confrontational.184 Decoding vague,

180 Jason Millar “A Problem for Predictive Data Mining” in Ian Kerr, Valerie Steeves and Carole Lucock (eds) Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society (Oxford University Press, Toronto, 2005) at 110.

181 Christopher Parsons “Putting the Meaningful into Consent” (16 October 2010) Technology, Thoughts & Trinkets <http://www.christopher-parsons.com/references-for-putting-the-meaningful-into-meaningful-consent/>.

182 Rob Waugh “People are willing to trade private data for pistachio cookie” (2 October 2014) We Live Security

<http://www.welivesecurity.com/2014/10/02/people-willing-trade-private-data-pistachio-cookies/>.

183 Tene, above n 16, at 261; Joseph Turow and others “The Federal Trade Commission and Consumer Privacy in the Coming Decade” 3(3) J L & Policy for Info and Soc’y (2007) 723 at 724.

184 Mindy Chen-Wishart “Contract Law and Uncertain Terms” (Staff Seminar given to University of Otago Law Faculty, 25 July 2014).



elastic terms about reuse that enable “improvement of customer experience” is not productive.185 The participation deficiency stems from an overriding sense that users are “in the dark” and shut out from transparent and active engagement.186

Yet in terms of combatting information asymmetries, the tide is starting to turn. The establishment of New Zealand’s Broadband Product Disclosure Code illustrates the drive to combat the ‘fog of ignorance’ that can enable unethical use.187 This self-regulatory code, which outlines and compares broadband offerings to customers, could be used as a model to translate to the area of data protection.188 However, the fact still remains that merely forcing data controllers to notify users of the risks they are taking could not only overwhelm them, but fail to nudge individuals into privacy-enhancing behaviours.189

185 European Commission, above n 4, at 34. This refers to the recent French consumer group which launched legal action against three of the largest social networks, criticising their confusing ‘elliptique et pléthorique’ (elliptical and excessive) contractual terms; Alina Tugend “Those Wordy Contracts We All So Quickly Accept” The New York Times (online edition, New York, 12 July 2013)

<www.nytimes.com>; Apple Mavericks Privacy and Terms of Service (2 September 2014). This policy was accessed by the author when downloading the latest OS X (10.9.4). The policy was at least half the length of this article and offered an easy way to skip reading it and proceed to the “I agree” phase.

186 Yannis Bakos, Florencia Marotta-Wurgler and David Trossen Does Anyone Read the Fine Print? Testing a Law and Economics Approach to Standard Form Contracts (CELS 4th Annual Conference on Empirical Legal Studies Paper, 2009); Brill, above n 10.

187 World Economic Forum Rethinking Personal Data, above n 60, at 7.

188 New Zealand Telecommunications Forum Broadband Product Code (Wellington, 23 October 2013) available at

<http://www.tcf.org.nz/library/d2225da1-d8b2-4e8e-8308-d025091fa2ac.cmr>.

189 Ctrl-Shift “Mapping the Market for Personal Data Management Services” (20 March 2014) <https://www.ctrl-shift.co.uk/home/?CSRF_TOKEN=46c5c5922f666be1ab43e205168a86c64e51ec60>.


4. Principle Two: Data Holders Must Create Consumer-Friendly Privacy Settings

“We need a new commercial order in which data subjects are emancipated from systems built to control them and become free and independent agents in the marketplace.”190 - Doc Searls


A second principle to guide the DSA is the creation of consumer-friendly privacy settings. This requirement would aim to bridge the gap between ineffective command style privacy interfaces and a more desirable form of user engagement. The pivot point for this regulatory ecosystem must hinge on the concept of ‘user-centricity’.191 Opportunities do exist for liberating individuals from ‘antihuman’ systems that treat users as mere gadgets.192 Provided there is a genuine shift towards a more humanised paradigm where the user becomes the nucleus of the ecosystem, the goal of creating more consumer-friendly privacy policies may be within reach.
(a) Informed consent

A core facet of the final principle concerning user-centricity is the ability to meaningfully ‘inform’ the data subject.193 To achieve this, there needs to be a shift towards conceptualising consent in a self- determinative way. The mandate to provide privacy information to users in a form that clarifies the nature of the data capture, reuse and downstream sharing is becoming increasingly critical.194 In recognising

190 Doc Searls The Intention Economy: When Customers Take Charge (Harvard Business Review Press, Boston, 2012) at 1; Ctrl-Shift “The New Personal Data Landscape” (22 November 2011) <www.ctrl-shift.co.uk>.

191 Rubenstein, above n 6, at 9.

192 Jaron Lanier “You are not a Gadget: An apocalypse of self-abdication” (Knopf, New York, 2010) at 26; Ann Cavoukian, above n 264, at 90; Ctrl-Shift, above n 305.

193 Helen Nissenbaum and Solon Barocas “Big Data’s End Run around Anonymity and Consent” in Julia Lane and others, above n 177, at 59.

194 Simone Fischer-Hübner, above n 169, at 134.



that informed consent may no longer be a match for the challenges posed by big data, the DSA measures should be more than just operationally-focused.195 In pursuing this objective, we ought to transcend the notion that ‘shedding sunlight’ on personal data arrangements is adequate, and instead strive towards ensuring the user’s ‘ammunition’ is more fitting for what Acquisti terms the ‘data gunfight’.196

One way in which informed and contextually-driven consent could display more granularity is through sliding scales.197 In this respect, how the data is protected needs to be weighed against the sensitivity of the information collected. For the privacy settings to capture the texture of data stewardship, the decisional criteria behind data management choices also ought to be elucidated, including perhaps the disclosure of algorithms.198 Greater exposure of how decisions are weighed would help users gain trust in the entities they interact with, and greater insight into the variables that influence data-sharing.199
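The sliding-scale idea can be expressed quite literally in code. The sketch below maps assumed sensitivity tiers to progressively stronger consent mechanisms; the categories, tiers and mapping are illustrative assumptions only.

```python
# A sketch of sliding-scale consent: the strength of consent demanded rises
# with the sensitivity of the data collected. All tiers are illustrative.
SENSITIVITY = {"page_views": 1, "purchase_history": 2, "location": 3, "health": 4}

def required_consent(data_type):
    """Map a data category to the consent mechanism its sensitivity warrants."""
    level = SENSITIVITY.get(data_type, 4)  # unknown data defaults to most cautious
    if level <= 1:
        return "notice only"
    if level == 2:
        return "opt-out"
    if level == 3:
        return "opt-in"
    return "explicit, purpose-specific opt-in"

for data_type in SENSITIVITY:
    print(data_type, "->", required_consent(data_type))
```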

Achieving true informed consent also requires evaluation of downstream sharing agreements. The challenges posed by the chain of data stakeholders involved in the data enterprise make this an important practice to bring to the attention of users.200 Delving further into this issue prompts the question of when the data controller’s obligation to inform should end. Should the duty to provide ‘informed consent’ be rendered complete in terms of the data that is explicitly

195 Nissenbaum and Barocas, above n 193, at 63. The distinction between operationalising informed consent and informed consent itself ought to be recognised.

196 Rubenstein, above n 6, at 8; Acquisti, above n 32.

197 Paul Ohm “General Principles for Data Use and Analysis” in Lane and others, above n 177, at 105.

198 Simone Fischer-Hübner, above n 169, at 133.

199 Carolyn Nguyen “A User-Centered Approach to the Data Dilemma” in Hildebrandt, above n 103 at 23.

200 Nissenbaum and Barocas, above n 193, at 60.



recorded? Or should the data controller adopt a more encompassing approach, explaining what further information the organisation may glean? There appears to be a strong case for arguing that consent should cover not only the information explicitly provided, but also information derived from sophisticated analysis, including aggregation with other contextual or personal data.

In response to the trend towards increased downstream sharing, the Privacy Commissioner recently released privacy policy Guidelines for App Developers.201 The guidelines place emphasis on integrating privacy from day one, which entails raising awareness of whether the data is being ‘funnelled’ to downstream third parties.202 Whilst recognising the correlations between app developers, users and data miners, the announcement of these standards foreshadows the potential extension of the privacy benchmarks beyond the app domain to wider instances of data collection and manipulation.
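A short sketch suggests how such disclosure might work in practice: an app declares its data flows, including downstream recipients, in a manifest that can be rendered as a plain-language summary before collection begins. The schema and recipients are hypothetical, not a format the guidelines prescribe.

```python
# A sketch, in the spirit of the guidelines, of an app declaring its data
# flows up front so downstream 'funnelling' is visible. Schema is hypothetical.
DATA_FLOWS = {
    "step_count": {"purpose": "fitness tracking", "downstream": []},
    "location": {"purpose": "route mapping",
                 "downstream": ["adnetwork.example", "analytics.example"]},
}

def disclosure_summary(flows):
    """Render a plain-language summary a user could see at install time."""
    lines = []
    for field, spec in flows.items():
        recipients = ", ".join(spec["downstream"]) or "no third parties"
        lines.append(f"{field}: used for {spec['purpose']}; shared with {recipients}")
    return "\n".join(lines)

print(disclosure_summary(DATA_FLOWS))
```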
(b) Live consent

A final feature of Principle 2 that will help foster a culture of user-centricity is ‘living informed consent’.203 Living informed consent has been identified as a key strategy by privacy commentators and industry bodies, and the submission from Spark to the NZDFF likewise highlights the need for dynamic privacy.204 Rethinking privacy settings towards creating a living

201 Kate Fay “Fitness Apps Can Help You Shred Calories and Privacy” (May 2014) Adage <www.adage.com>. Recent studies by the US Federal Trade Commission reveal the extent of downstream sharing, with a sample study of twelve health and fitness apps disseminating personal data to 76 third parties.

202 New Zealand Office of the Privacy Commissioner, above n 313.

203 Greenwood and others “The New Deal on Data: A Framework for Institutional Controls” in Julia Lane and others (eds) Big Data, Privacy and the Public Good (Cambridge University Press, Cambridge, 2014) at 201.

204 Spark, above n 50, at 8.



conversation between data holder and data subject re-envisions the traditional notion of rigid preferences.

To respond to the need for consent measures that value the dynamic pace at which data is being ‘upcycled’ and disseminated, the DSA ought to focus on the preference functions around personal data.

Firstly, it is plausible for coverall consent to be offered at the beginning of the data stewardship process.205 Advocates for consent regimes of this genre challenge the prioritisation of active permissions, arguing that the value of live consent is overstated. The contention from data evangelists is that more data being used in unrestricted ways will always be beneficial, if only for reasons to be determined at a later date.206 Privacy scholar Omer Tene argues that an over-emphasis on consent may stifle innovation, and that neglecting to solicit consent can actually result in more positive outcomes for all parties involved.207 He cites examples such as Facebook’s proactive News Feed feature launch, and Google’s ‘wardriving’208 to map out Wi-Fi networks, as evidence of this. In the case of Google’s geo-location orientated exploits, had users been given the choice to opt their routers out of the wardriving campaign, it is doubtful that many would have done so, considering the recognised value of Google’s data use.209 These cases highlight the potentially regressive effect of consent-based processing, which may ultimately result in less utility for data users.

205 Edgar Whitley, above n 164, at 172.

206 Claudia Diaz, Omer Tene and Seda Gurses, above n 172, at 959.

207 Tene and Polonetsky, above n 16, at 262.

208 “Wardriving” refers to the act of searching for Wi-Fi wireless networks by a person in a moving vehicle, using a portable computer, smartphone or personal digital assistant.

209 Kevin O’Brien “Google Allows Wi-Fi Owners to Opt Out of Database” New York Times (online ed, New York, 15 November 2011)

<http://www.nytimes.com/2011/11/16/technology/google-allows-wi-fi-owners-to-opt-out-of-database.html>.



Nonetheless, the coverall approach has been condemned as excessive. Requiring data subjects to be ‘stuck’ with their initial choices would fly in the face of a living dialogue and emasculate the concept of informed consent.210 This wholesale attitude diminishes the value of informed consent because it relies on notice that fails to delimit future uses of data and their possible consequences.

Conversely, live consent would enable users to engage in real time through privacy triggers, an element that the NZDFF focused on in their final recommendations paper.211 This system could use appropriate notification standards from the recent New Zealand App Guidelines. For instance, if a user wanted to change privacy settings themselves, they should be provided with information regarding who will be able to view their data after the change.212 Since the timing of notification is also critical, icon-based notifications indicating when vital attributes like geo-location data are being mined could be useful. This is currently being explored through the ‘VRM Project’ at Harvard’s Berkman Centre for Internet and Society. This project pursues a vision where an individual is in “complete control of her digital persona and grants permissions for vendors to access it on her own terms without vendor lock in”.213 This echoes the NZDFF’s aim of more fine-grained control over personal profiles to achieve mutual gain in data exchanges. Striking the balance between open and active communication channels, and respect for an appropriate level of distance in the data relationship, is challenging, and cuts to the core of achieving live consent.
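The privacy-trigger concept can be sketched simply: any access to a designated sensitive attribute fires an immediate notification to the data subject. The class, attribute list and notifier below are hypothetical; a deployed system would notify through the platform’s own channels, such as icons or push messages.

```python
# A sketch of a real-time 'privacy trigger': reading a sensitive attribute
# notifies the data subject. All names here are illustrative assumptions.
SENSITIVE_FIELDS = {"geo_location", "health"}

class MonitoredProfile:
    def __init__(self, data, notify):
        self._data = data
        self._notify = notify

    def get(self, field, purpose):
        if field in SENSITIVE_FIELDS:
            self._notify(f"'{field}' accessed for: {purpose}")  # the trigger
        return self._data[field]

profile = MonitoredProfile(
    {"geo_location": (-45.87, 170.50), "name": "J. Doe"},
    notify=lambda msg: print("[privacy notice]", msg),
)
profile.get("geo_location", purpose="nearby offers")
```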


210 Mayer-Schönberger and Cukier, above n 6, at 154.

211 New Zealand Data Futures Forum (NZDFF) Third Discussion Paper, above n 149, at 20.

212 New Zealand Office of the Privacy Commissioner Guidance Note for App Developers 5 Point Checklist (Wellington, 24 July 2014).

213 Tene, above n 16, at 266.



Given the possibility that consent-based data acquisitions may not have been fully ‘informed’, updated prompts seem desirable. Looking at models of consent in the health sector, the Ministry of Health ‘Guidelines on the Use of Human Tissue for Future Unspecified Purposes’ provides useful insight into promising methods.214 The option to “recontact the donor in order to gain further consent” is indicative of the ‘living consent’ approach. This mode of consent, combined with providing practical insights in real time, is a suitable cross-industry impulse for the DSA to draw upon.215 Just as the guidelines offer indications about the nature of research carried out on tissue samples and the implications for the donor, the live consent standards imposed by the DSA could require data subjects to be notified on a similar basis. Empowering the user to know the types of proactive upcycling from the beginning could lead towards a more informed user base.

In response to concerns surrounding the intrusiveness of a live notification-based consent model, opt-out options must also be explored. To avoid initial bad bargains having long-term consequences, users should be able to retract consent and halt future use of their data at any point they feel is appropriate. Although we may presume the ability to opt in and out is a reality, the harsh truth is that most privacy policies operate on a model of endurance. Apple’s approach to the use of its Operating Systems (OS) is one relevant example. In the latest Mavericks OS 10.9.4 offering, users are presented with the option of termination, upon which the licence becomes redundant. However, the express limitation enabling downstream sharing to survive such termination allows Apple to ensure that the dissemination of users’ data is permanent.216 This kind of agreement highlights

214 Ministry of Health Guidelines on the Use of Human Tissue for Future Unspecified Purposes (Ministry of Health, Wellington, March 2007) at 9.

215 New Zealand Data Futures Forum (NZDFF) Third Discussion Paper, above n 149, at 6.

216 Apple Mavericks Privacy and Terms of Service (September 2 2014).



the current artificiality of consent to data-sharing. Moreover it reaffirms how genuine consent is being constrained by the mainstream acceptance of wholly submitting to these less than desirable privacy policies.

It is against this backdrop that the NZDFF has recommended bolstering the right to opt out. There is an obvious need for more clarity surrounding consent arrangements. The NZDFF suggests incorporating the right to opt out in standard terms and conditions for consent to data services. While there are technical limitations to this, opting out could also be accompanied by ‘best-efforts provisions’ to delete all the relevant data.217 Whilst it is not within the ambit of this article to investigate the related issue of the ‘Right to be Forgotten’, this is a pertinent question that warrants serious discussion regarding its impact on information privacy law.
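The mechanics of such an opt-out can be sketched briefly: retracting consent immediately halts future processing and queues a best-efforts deletion. The structure and names below are illustrative assumptions, not the NZDFF’s specification.

```python
# A sketch of opt-out mechanics: retraction halts all future processing and
# queues a best-efforts deletion. Class and method names are illustrative.
class ConsentRegistry:
    def __init__(self):
        self._active = {}          # user_id -> set of consented purposes
        self._deletion_queue = []  # users awaiting best-efforts deletion

    def grant(self, user_id, purposes):
        self._active[user_id] = set(purposes)

    def opt_out(self, user_id):
        self._active.pop(user_id, None)       # no future use, effective now
        self._deletion_queue.append(user_id)  # deletion attempted downstream

    def may_process(self, user_id, purpose):
        return purpose in self._active.get(user_id, set())

registry = ConsentRegistry()
registry.grant("u42", ["analytics"])
registry.opt_out("u42")
print(registry.may_process("u42", "analytics"))  # False: consent retracted
```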

~~~~~

The degree to which consumer-friendly privacy settings can prevail as the norm which data holders must abide by depends to a large extent on the self-awareness of users. Once appropriate standards are in place, the responsibility is on users to maximise the dynamic interface that has the capacity to stimulate a living dialogue. Companies can extensively visualise, clarify and inform users, but if the data subject remains disengaged, then consumer-friendly privacy settings will fail to gain traction. For consent to be truly ‘live’, the continuing conversation must be valued. Like any fruitful relationship, this requires active listening to ascertain what each stakeholder wants out of the data exchange. Thus, the new guidelines must be founded on user-centric principles that balance regulatory certainty with flexibility so that the dynamism of data can be accounted for.

217 New Zealand Data Futures Forum (NZDFF) Third Discussion Paper, above n 149, at 71.


V. Conclusion

“In God we trust. All others must bring data.”218


Big data has come. And it is trampling all over privacy law. The nuanced ways in which data analytics operate, and the vigour with which the technological landscape is changing, mark a unique moment in societal development. In grappling with the question of how data can operate for us, and not upon us, this article set out to explore how to shift New Zealand’s privacy landscape towards more progressive legal tools.

The regulation around data stewardship is critical. This is evidenced by the national conversation which has already begun. Discussion papers from the NZDFF and the PC are offering support for changes in the data protection sphere. This article has proposed a fresh regulatory framework. With this body in place, New Zealand’s ability to navigate the information industry through relevant privacy protections would be more assured. In light of the current power imbalances between data holders and users, a Data Standards Authority could be truly valued. This body would be well placed to provide a baseline of best practices for data stewards and downstream ‘upcyclers’, whilst offering robust accountability measures to encourage organisations to engage in a more responsible and responsive data-use ecosystem.

Paying heed to the notion that “cyberspace has no intrinsic nature. It is as it is designed,”219 the strategy for overcoming the inadequacies of New Zealand’s data protection law has focused on the formulation of guiding principles. I have suggested several that the proposed amendment to the Privacy Act could encompass, and which would lay the groundwork for the formulation of industry-specific codes. These

218 Hastie and others The Elements of Statistical Learning (2nd ed, Springer, New York, 2009) at vii.

219 Lawrence Lessig Code 2.0 (Basic Books, New York, 2006) at 317.



not only stress the operational side of data protection and effective PbyD techniques, but also the behaviour-driven elements concerning user empowerment and engagement models. New Zealand need not only rely on regulatory reform to achieve its data protection goals – it can, and should, take advantage of emerging business models in which firms decide to empower consumers and enhance individual control over personal data.

We would be wise to avoid a tragedy of the data commons,220 in which individualistic and exploitative pursuits by data holders override and deplete the potential value of the data resource. Not only would this be contrary to individual privacy rights and the orientation of data protection towards providing transparency, it would also be contrary to the long-term interests of society. There is little doubt that data is a highly strategic asset that has the power to be the new engine of our increasingly digitalised economy.

The cultural commentator McLuhan recognised that “we shape our tools, and our tools shape us”.221 Careful carving of this privacy toolkit, and close attention to the form our privacy messages take, will enable a sharper, more robust data ecosystem. The chosen tools to govern personal data will have a profound impact on New Zealand’s capacity to be a world leader in delivering a trusted digital environment where the big data benefits can be realised. Big data can be harnessed to serve the public good. The only limitation will be deciding to what extent we want our digital future to be guided by an ever-changing petabyte- driven compass.



220 Jane Yakowitz “Tragedy of the Data Commons” (2011) 25(2) Harvard J L & Tech at 4.

221 Marshall McLuhan “Understanding Media: The Extensions of Man” (McGraw-Hill, New York, 1964) at xi.

