Terror in Chch
Alt-right-delete: Stopping shooting video’s spread
Behind a Government agency’s campaign to try to scrub the internet of the Christchurch mosque shooting video. David Williams reports.
Years ago it monitored dairies and shops to ensure pornographic magazines and videos were displayed correctly. Its act hasn’t changed since 1993.
So how did a small Department of Internal Affairs unit, with about 30 people, cope when an alleged killer live-streamed last Friday’s Christchurch shootings on Facebook, and the footage rebounded around social media and into sinister spots on the internet?
Well, people have already been charged, after Chief Censor David Shanks made it illegal to watch the attacker’s 17-minute video by classifying it as “objectionable”.
Two men, aged 44 and 18, have appeared in the Christchurch District Court on charges related to sharing the helmet-cam video of the shooting at the Al Noor Mosque. Philip Neville Arps and the teen, who has name suppression, are due to appear again next month. A Masterton woman has also been arrested but, according to reports, hasn’t been charged.
Department of Internal Affairs director of digital safety, Jolene Armadoros, says it has received about 160 complaints – the first last Friday night – related to the live-streaming footage, and that number’s growing. Complaints range from seeing offensive material on online platforms, to content hosts not acting fast enough to pull the video down, to people dobbing others in for offering to send them a copy.
“There are people who will be receiving warnings as a result of this, and some people where there will likely be some stronger action taken.”
Armadoros says it takes more seriously people who are “actively trying to share and encourage” the footage’s publication. The department isn’t interested in pursuing the broader public. “I don’t want people to be freaked out that the 17-year-old that accidentally came across this on Friday is going to receive a knock on the door.”
Big tech criticised
The department (DIA) works internationally and with a host of local agencies, including Police, which has a high-tech crime team, NZ Customs, CERT NZ, and independent online safety organisation Netsafe. Netsafe’s executive director Martin Cocker says it’s received about 250 reports related to the mosque attacks, footage of which has ended up on major social media platforms like Twitter, Facebook, and YouTube.
Public debate has centred on the tech giants, which have been lambasted – and urged to stop live-streaming – in the wake of the attack. Disturbingly, there have been millions of attempts to share the footage on the major social media platforms.
Facebook, the social media giant on which the video was live-streamed, said in the first 24 hours it removed about 1.5 million videos of the attack, 80 percent of which were never seen. Meanwhile, some of New Zealand’s biggest companies have suspended advertising on some social media platforms until the platforms deal better with the issue.
Netsafe’s Cocker says if a New Zealander complains to one of the big platforms about seeing a copy of the video, or a link to it, that can lead to the removal of thousands of copies. “It’s worth reporting,” he says.
“Our hope and our objective is that this content which is illegal won’t accidentally pop up on New Zealanders’ internet searching and internet lives, and that only people who are dedicated to go search it out would be able to find it.”
“There are people out there trying to organise other attacks, whether or not that’s in New Zealand or America or whatever, who knows.” – University student
Newsroom has spoken to a University of Canterbury student who complained to DIA about a particular web platform, which we’ve chosen not to name so curious people don’t go looking.
There’s a murky part of the internet where online gamers, in particular, chat to each other, anonymously, on poorly regulated private servers. (DIA’s Armadoros says: “There are means of, sometimes, identifying people on those platforms.”)
It’s well-known as a haven for white supremacist, alt-right hate speech. “It was the first place I went to,” the student says.
She found multiple appreciation threads of the shooting on notorious platform 8chan, with people sharing the footage, and making memes about it. This was an hour after the shooting, while she was in lockdown in the CBD.
A link to a different chat channel server encouraged members to view the footage. There were 3000 people logged in. “There are people out there trying to organise other attacks, whether or not that’s in New Zealand or America or whatever, who knows.”
It’s terrifying, she says, and has been happening for a while.
Facebook said the live-stream was viewed fewer than 200 times during the live broadcast. It was first reported 29 minutes after it began, and viewed about 4000 times before being removed from Facebook. But a link had already been posted to a file-sharing site.
The Canterbury University student logs in to the channel from her computer: “I’m just scrolling through a thread now and it’s, like, swastikas, people blaming Jews, pictures of Hitler.” The difficulty, she says, is it’s presented in a joking, ironic tone, which makes it hard for algorithms to clean up.
Christchurch researcher of the alt-right, Ben Elley, wrote in the NZ Herald that niche internet humour is designed to shock “but may also serve to acclimatise young viewers to casual racism and violence”.
New York Times writer Charlie Warzel observes: “As terrifying as the violence itself is, so, too, is how well the online community worked in the gunman’s favor. This may be our new reality. Not only has conspiratorial hate spread from the internet to real life; it’s also weaponized to go viral.”
The shooting video autoplayed on some social media feeds before people, including school children, knew what they were seeing. A Hawke’s Bay parent said on Twitter half a school classroom of 12-year-olds watched the video last weekend.
Is the video still out there? Yes, Netsafe’s Cocker confirms.
What to do if you’re distressed
The National Telehealth Service, 1737, has contracted an extra 80 psychologists, counsellors, and experts since last Friday, beyond its mental health and addictions team’s usual staff of 50 people.
Up till 7am on Tuesday, it had delivered 2354 phone and online counselling sessions. Phone calls averaged about 40 minutes. The service usually averages 200 enquiries a day. Chief executive Andrew Slater says its frontline workers estimate about 90 percent of the calls relate to the Christchurch shootings.
Those who watched the live-stream footage are “virtual eyewitnesses”, he says. “People who have seen that, that are distressed by it, really should get in touch with us, or a professional, as soon as possible.”
The service’s mental health manager, Mel Grant, says distressed people can call any time – but the earlier the better.
Clinical psychologist Ian Lambie warned people not to watch the video. He told the NZ Herald it was traumatising to watch. “It’s not going to help. And as well as being unhelpful it’s voyeuristic and totally an awful thing to do.”
DIA’s Armadoros, the director of digital safety, says there’s no benefit to anyone seeing it – in fact, it’s harmful to watch. “What we’re trying to remind people of, too, is actually we’re human.”
She quickly scotches a narrative going around that censors are trying to “hide” the video despite public interest.
“Actually, the facts around the event are public. What’s happened is bad and it’s known. In no other crime type would you expect it to be somebody’s right to access the content.
“If somebody is raped, no one starts to complain and say, why didn’t I see the footage of that? When someone breaks into a home or commits an act in public that’s caught on CCTV, people don’t publicly demand that they have access and the ability to hold that footage.”
It’s a high bar, but video met it
Last Friday afternoon, Armadoros’ team quickly determined the shooting video probably met the legal test for being objectionable. That is, it depicts the infliction of serious physical harm or significant cruelty, it’s demeaning or degrading, and promotes acts of terrorism.
“We know the nature of the internet is, sadly, you’re never going to stop this thing from having a footprint, but we can work really hard to prevent it from having as big a footprint as it could have had and to mitigate the sharing of it.”
Over the weekend, DIA contacted the technology industry and enforcement agencies overseas, sending the message the footage was likely to be banned and it needed to pull down videos. (Locally, internet service providers worked together to block access to websites that didn’t respond quickly enough to requests to take down the video.)
Only 14 of Armadoros’ 30-strong team are dedicated to objectionable content, but most have been diverted to deal with the terrorist’s footage. She says her unit has responded well, but no agency was truly prepared for such a large-scale event.
That’s not meant as a slight on the efforts to date, she says. “We didn’t expect it, we don’t see this happening in New Zealand. As our prime minister said, this is not us.”
Netsafe’s Cocker says New Zealand agencies “scramble pretty well”. “We weren’t caught napping but I don’t think people were as well prepared for something of this size and magnitude as we would like. No doubt there’ll be some changes.”
The Films, Videos, and Publications Classification Act 1993 is ripe for review, considering when it was enacted the department was tasked with inspecting movie theatres and dairy magazine racks. Now its work revolves around websites, apps, and things shared through the internet.
It’s “absolutely” time to review the Act, Armadoros says. “We’ve been in conversations about that the past few months anyway.”
Internal Affairs Minister Tracey Martin says she’s talking to other ministers about when the regulation of harmful material on the Internet, including social media, can be reformed.
“A major problem is that the current media content regulation regime is outdated – being developed pre-Internet – applied inconsistently, and not fit for purpose. It needs to be modernised to reflect the shifting ways New Zealanders now use and interact with media content.”
New Zealand’s regulations try to prevent harm to consumers and subjects of social media content. But Martin’s especially concerned about evidence showing children and young people are at risk of psychological, physical and emotional harm from viewing inappropriate content on social media.
Cocker says regulations have to be reviewed to determine what should be made an offence. Staff levels will have to be boosted. “Our cyber crime-fighting capability is pretty stretched already, so if you give them more cyber crimes to fight it’s going to have to be supported.”
Following that is the enforcement itself. “There’s quite a bit of work to do before there’ll be any real action.”
Chasing the shooting video has highlighted a conflict between law and regulation, Cocker says, “and multinationals who followed their own rules, or rules made in different jurisdictions”.
Improvements could be made, he thinks, by emulating the success of agencies’ international cooperation to track down child sexual abuse material. “I think everybody will be keen to review and talk about how we might react to one of these mass online safety events in the future.”
Armadoros is clear hate speech and terror-related material need to be bigger priorities for her unit. “Expect to see some real traction and response over the coming weeks.”
WHERE TO GET HELP
National telehealth service, phone or text 1737
This story has been updated with comment from Internal Affairs Minister Tracey Martin.