Investigators found thousands of images and videos of terrorist content, child sexual exploitation material, animal cruelty, bestiality and gore on electronic devices seized from a teenager last autumn.

In its annual transparency report, the Department of Internal Affairs’ Digital Violent Extremism Team released details about Operation Flare, an investigation that ran from August 2022 through to the arrest of the young person in autumn 2023.

Unbeknownst to their family, the teen had been radicalised online through extremist content and interactions with other extremists. While their family thought they were online gaming, the teen spammed vulnerable online communities with illegal, harmful content and amassed a trove of footage and images, including footage of extreme sexual homicides.

The operation began with a tip-off from an online platform, which had detected a New Zealand-based user uploading footage of the Christchurch terror attack.

A little while later, the National Center for Missing and Exploited Children – which reports child sexual exploitation material to law enforcement around the world – made over 60 referrals to the violent extremism team, “regarding the same user uploading and sharing ‘cross-genre’ objectionable content, which included images and videos depicting the sexual exploitation and abuse of children, extreme sexual violence (including homicide) and violent extremist material across a range of different online platforms”.

The person in question was found to be “prolific” in distributing this content, including to other young people and in spaces frequented by youth. Following a “lengthy investigation”, the investigators raided the person’s residence, executing multiple search warrants and seizing multiple electronic devices that belonged to them.

These devices were found to contain thousands of images and videos, some of which were shared with partner agencies. The family said they had “no prior concerns relating to their online activity, or their collection and distribution of harmful materials, and the activity came as a major shock to them. They thought the individual was merely online gaming, although their online time was unsupervised”.

The person is now facing charges of possessing and distributing objectionable content before the Youth Court, a spokesperson for the department told Newsroom.

The case was part of a pattern of increasing radicalisation of minors globally and in New Zealand, the department said. While previous such cases were usually limited to terrorist and violent extremist material, more individuals are now seeking out a “substantial variety of objectionable content”, including child exploitation, gore and animal cruelty.

Operation Flare was just one case study exemplifying the violent extremism team’s work across 2023.

The Israel-Gaza war, as well as other high-profile conflicts last year, saw the “weaponisation of war footage to serve extremist purposes”, the department reported.

“We received a total of 83 URLs referred from the ongoing wars and conflicts by New Zealanders concerned that the content in question could be illegal. This content is often brutal and confronting, however, much of this content was not assessed as reaching the threshold of objectionable under the Classification Act because (while often extremely violent) the material stopped short of clearly supporting, or tending to support, violent extremism.”

Several publications relating to the wars in Gaza and Ukraine were passed on to the Classification Office, which determined only a small number of them were illegal.

In another instance, researchers at the international NGO Counter Extremism Project referred 33 web links to content featuring the Christchurch terror attack footage, hosted on a Russian-owned video platform based in a European country.

The platform rejected take-down notices for 26 of the links, but the New Zealand Police’s liaison officer at Europol briefed the country’s law enforcement on the issue and 19 of the webpages were removed. Five that “gamified” the terror attack were kept online because they weren’t thought to be illegal under the country’s laws, and the remaining two are still under investigation.

Overall, 886 links were referred to the extremism team last year, of which around two in five were deemed to be illegal content. This was a 25 percent increase in referrals and came as the department itself scaled down its proactive investigation work. In 2022, nearly 300 links were identified by the department. Last year, that number fell to just 38.

The department also “sharpened its focus” to act solely on content that is likely illegal. In past years, the team also informed platforms of “lawful but awful” content that might breach their terms and conditions.

Nearly half of the links referred to the department involved white supremacist content. This content was most common on TikTok, Gab, Instagram and Telegram while Twitter links were more mixed. Twitter was the most commonly referred site, with 261 links. TikTok, in second place, had 122.

TikTok was one of the most responsive platforms, removing 81 percent of the content in question. Twitter and Telegram removed more than half of the webpages.
