What sovereignty means in a digital world

For Election 2020, Bernard Hickey begins a three-part series on the digital policies needed to bolster New Zealand’s resilience, protect our sovereignty and reduce inequality in the wake of the Christchurch attacks, foreign interference in elections, the Covid-19 pandemic, America’s tech war with China, and state-sponsored cyber-attacks.

There have been three moments in the past 18 months when it dawned on New Zealand that it had lost effective control over large parts of national life in ways we could never have imagined in the days when sovereignty was all about wars, invasions and peace treaties.

Now sovereignty is all about which platform you get your information from, and run your business or school from, where it is hosted, what is written into the algorithms it uses and who is hacking into it. When we want to keep our data safe and private, to fix the holes in our tax base, and communicate with our own government, we now have to negotiate with tech CEOs or non-existent help desks, rather than other nation states.

So what would taking back that sovereignty look like? And how could we do it?

Here is the first of those three moments, and the policies that could be used to address it.

Moment 1: March 15, 2019

We all remember where we were that Friday afternoon and how we felt when, somehow, snippets of the live-streamed video of the Christchurch attacks snuck into our news feeds on Facebook, Twitter, Instagram and all the rest. And onto the phones of our children as they left school. It felt as if the whole nation had suddenly been violated and was under attack.

Vodafone CEO Jason Paris, then-Spark CEO Simon Moutter and 2degrees CEO Stewart Sherriff certainly felt they needed to take drastic action to reclaim some control and limit the damage that afternoon. They jumped on a conference call to erect some sort of common defence because, combined, they had the ability to essentially turn off parts of the internet at the source and stop an enormous hate crime being broadcast through their networks to their customers.

For the briefest moment, they considered turning off YouTube and Facebook that afternoon. They had the power, given they controlled the vast majority of New Zealand’s mobile and broadband networks, but they had no precedent, nor what they felt was a mandate to act on behalf of the nation. Instead, they turned off only 4chan and 8chan, which were the original sources and distributors of those video snippets and which hosted the gunman’s enablers. By then, the video was metastasising across the internet and into all its nooks and crannies. It took weeks for YouTube and Facebook to eventually scrub the copies out.

Moutter, Paris and Sherriff felt turning off these platforms would have been a step too far. They were too intertwined in how we live now. Just as during the Christchurch earthquakes, many New Zealanders now use the likes of Facebook, Messenger and WhatsApp to find and reassure relatives and friends, and to coordinate a response. The horse had bolted.

“Censorship is something we take incredibly seriously. It’s not our job to decide what people want to watch, but in this instance it was an extreme act of terrorism and we took extreme action in return,” Paris said at the time.

But it was clear that afternoon that New Zealand had to do something to reclaim control. It had lost sovereignty.

Chief Censor David Shanks banned the video in New Zealand and, almost a week later, the attacker’s manifesto, but by then the damage had been done, and both items remain accessible in various forms on the internet.

Prime Minister Jacinda Ardern launched the global ‘Christchurch Call’ to get the big tech companies to voluntarily clean up their act.

“We cannot sit back and accept these platforms just exist, that what is said on them is not the responsibility of the place they are published,” Ardern said in Parliament the week after the attack.

“They are the publisher not just the postman. There cannot be the case of all profits and no responsibility,” she said.

Yet Facebook in particular has failed to clean up its platform, as Marc Daalder reported in depth earlier this year. It refused to shut down the livestreaming facility used by the Christchurch attacker and has repeatedly refused to shut down or control the myriad groups spreading misinformation or outright lies on issues ranging from the QAnon conspiracies to 5G, Covid-19, 1080 and vaccination.

So what should be done?

Andrew Chen is a research fellow at Koi Tū, the Centre for Informed Futures at the University of Auckland, and has just edited a book about these issues, 'Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand'.

“I think that we need a government agency and/or Minister to say ‘misinformation or disinformation is in my portfolio’ so that we can get some good resourcing towards developing longer-term policy responses about these challenges,” Chen said.

“It's not about fact-checking individual pieces of information (or establishing a "Ministry of Truth"), it's about building the capability for individuals to be resilient to misinformation. We need more education about these issues to build long-term capacity to respond.”

Chen sees the need for better, more reliable information to be distributed so readers and viewers are ‘inoculated’ against the conspiracy theories that abound.

“When there is a vacuum, information floods in to fill that gap. People will accept information that fits their assumptions and priors, and the particular misinformation we're seeing at the moment hits that balance of being weird enough but also convenient enough to fit that gap,” he said.

What Government could do

Currently, the Government itself is a heavy user of Facebook to hold livestreamed events and distribute public messages, often paid for with advertising. Many government agencies paused their advertising on Facebook in the wake of the Christchurch attacks, but soon moved back because they believe ‘that’s where the people are.’

The Prime Minister still uses videos on Facebook to distribute her own messages direct to voters and she pushed back against calls for the Government to copy Stuff’s recent decision to abandon official use of the platform.

Catalyst IT CEO Don Christie believes the Government would be better off using local, open source platforms as the basis for its communications, to avoid becoming dependent on an unaccountable operation that may not have the same interests as New Zealand citizens. It could, for example, use open source video streaming platforms hosted locally as a base.

“If we combine a bit of money, we can create a streaming service ourselves. It can secondary stream into YouTube and Facebook if we want it to. But we need that primary little bit of infrastructure to be under our control,” he said.

If Facebook did not act over election interference, "we can chop it off, which we can't do at the moment because we haven't got any of that infrastructure.

“They'll argue that they've got to be where people are. But the point is that you don't have to not go there. You just don't have to force everyone to go there.”
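The ‘primary infrastructure under our control, secondary stream outward’ pattern Christie describes can be sketched with, for example, the open-source nginx-rtmp module: a locally hosted ingest point that optionally relays the same feed on to the big platforms. This is an illustrative sketch only; the endpoints and STREAM_KEY values are placeholders, and Facebook’s RTMPS-only ingest would in practice need a local TLS relay (such as stunnel), since nginx-rtmp pushes plain RTMP.

```nginx
# Sketch: a locally hosted primary streaming ingest
# (nginx built with the open-source nginx-rtmp-module)
rtmp {
    server {
        listen 1935;              # standard RTMP ingest port

        application live {
            live on;              # accept live streams from local broadcasters

            # Optional secondary streams out to offshore platforms.
            # STREAM_KEY values are placeholders for illustration only.
            push rtmp://a.rtmp.youtube.com/live2/STREAM_KEY;

            # Facebook requires RTMPS, which nginx-rtmp does not speak
            # directly, so this pushes to a local TLS tunnel instead.
            push rtmp://127.0.0.1:19350/rtmp/STREAM_KEY;
        }
    }
}
```

The point of the sketch is the topology rather than the exact directives: the primary ingest stays on locally controlled infrastructure, and each push to an offshore platform is an optional line that can simply be removed.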

Key policies:

The appointment of a Minister to counter misinformation, with funding to improve education and alternatives, particularly through local media.

The Government should wean itself off Facebook and YouTube and build local infrastructure.

The Government should adopt Australia’s push to force Google and Facebook to pay copyright fees to local news providers for news and information put on their feeds and search engines.

A panel discussion

The following is a video of a panel discussion recorded on August 25 to accompany this series of articles. Bernard Hickey moderated the panel, which included the following members:

Marianne Elliott is a co-director of The Workshop, an independent think tank and consultancy group based in Wellington. She led the creation of the ‘Digital Threats to Democracy’ report for the Law Foundation last year, and The Workshop has also produced reports on digital divides and on online hate and offline harm.

Andrew Chen is a research fellow at Koi Tū – The Centre for Informed Futures at the University of Auckland. He edited the book just published through Bridget Williams Books, 'Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand'.

Anjum Rahman is a project leader for the Inclusive Aotearoa Collective Tāhono. She is a Muslim community leader and human rights activist based in Hamilton. She co-wrote a chapter on reducing online harm in 'Shouting Zeros and Ones' and is a chartered accountant in her spare time.

Caleb Moses is a Māori data scientist working for Dragonfly Data Science in Auckland. He is focused on using AI and machine learning, including building the first speech-to-text model for Te Reo Māori.

Don Christie is the co-founder and CEO of Wellington-based IT group Catalyst, which works with open source software and technologies on projects across government and the wider economy.

Catalyst IT is a foundation supporter of Newsroom. 

(Article updated to include video of panel discussion)
