The pandemic is Facebook’s ultimate test - and it’s failing
Opinion: After the 2016 American election and the March 15 terror attack, Facebook pledged to rid its platform of misinformation and harmful content. The pandemic has put these commitments to the ultimate test - and so far, Facebook is failing, Marc Daalder argues.
On April 7, after viewing a television news report about North Carolina real estate agent Dennis Burgard's one-man protest against the state's lockdown, Audrey Whitlock started a Facebook group called ReopenNC.
Less than a week later, it had 20,000 members and had spawned dozens of copycats in other states and countries.
On April 13, between 200 and 300 protesters surrounded the Ohio statehouse, pressing against the glass doors while officials briefed reporters on the daily new case numbers. The next day, in North Carolina, Burgard was joined by around 150 others, resulting in at least one arrest for violating the state's stay-at-home order.
On April 15, thousands of protesters crowded the streets of Lansing, Michigan, stopping traffic and delaying a shift change at a nearby hospital. Two days later, 800 protesters targeted the Minnesota governor's residence, encouraged by a tweet from President Donald Trump calling on them to "LIBERATE MINNESOTA!".
As the rallies spread around the country, and eventually the world, they remained bound by two things: the outlandish conspiracy theories that some, but not all, of the protesters endorsed, and the use of Facebook to organise and propagandise.
Some blamed 5G technology for the outbreak of the virus, others posited it was a plot by Bill Gates to enforce mandatory vaccination on the global population. These theories were endlessly espoused and debated on the various Reopen Facebook groups and then made the jump to real-world protests. Assault-style firearms and anti-Semitic imagery were also a common sight at the rallies. Masks and social distancing were rarer.
New Zealand wasn't immune to this behaviour either. On May 16, 70 people turned out in Nelson to listen to anti-1080 lawyer Sue Grey falsely inform them the Level 2 restrictions were guidelines and not law. The same day, a dozen marched down Auckland's Queen Street and around 50 gathered outside Parliament in Wellington. The Telegram channels that organised these events had just a handful of members, but their turnout was bolstered when the plans were shared to Facebook.
Then, the predictable happened. Whitlock revealed on April 28 she had tested positive for Covid-19 nearly two weeks earlier, even though she had attended at least one rally in the intervening period. On May 8, Wisconsin health officials revealed that 72 people had tested positive for Covid-19 after attending a large gathering, thought to be an April 24 anti-lockdown protest.
Facebook's inability to rein in misinformation has led to demonstrable harm and the exacerbation of the Covid-19 pandemic.
It has set a high threshold for removing misinformation about the virus - only that which "could contribute to imminent physical harm" - and places ineffective warnings on other fake news. Misleading text posts that don't meet the threshold are not subject to these warnings and can be rapidly shared across the platform.
It has failed to adequately moderate private groups, where tens of thousands of people are free to share false information without any serious intervention. Even misinformation that does meet the high bar for removal, such as incitement to sabotage cell towers, is often left standing in private groups that operate as a Wild West beyond the reach of the company's best moderation tool: public scrutiny and reporting. Newsroom has found multiple instances of such posts still available to be read and interacted with more than six weeks after they were first made.
Facebook has made numerous commitments in recent years to clean up its platform.
After revelations that fake news about the 2016 United States presidential election had been stoked by state actors on Facebook, CEO Mark Zuckerberg promised to go after misinformation with a vengeance.
After a terrorist livestreamed on Facebook the murder of 51 Muslims in Christchurch on March 15, the site pledged to rid itself of harmful content.
The Covid-19 pandemic is Facebook's Sandy Hook moment.
On December 14, 2012, a 20-year-old man stormed Sandy Hook Elementary School in Connecticut and killed 20 children and six adults. At the time, it was widely assumed this would be a turning point for the United States. Surely the leader of the free world would take action on its lax gun laws after a man used them to murder 20 first-graders?
Facebook faces a similar moment. Will this be a turning point where it bucks shareholder pressure, cracks down on private groups and deals more harshly with all harmful misinformation? Or will it take the easy way out by ducking responsibility for what users post, allowing bad actors to use the platform to spread misinformation, encourage people to turn out to illegal rallies and worsen the spread of a deadly virus that has already claimed 300,000 lives worldwide?
So far, Facebook has chosen the second option.
Why focus on Facebook, when there are dozens of social networks out there and numerous websites that serve as a far safer haven for conspiracy and extremist views?
In part, it's a matter of scale. Facebook has more monthly active users than any other social media site - 2.6 billion, to YouTube's 2 billion, Instagram's 1 billion and Twitter's 330 million.
More importantly, it's a question of audience.
While Twitter is often seen as a hotbed of far-right and fringe conspiracy activity, the people who access the platform are far more likely to be interested in exploring or debating political ideas in the first place. People join Twitter to participate in niche communities and engage in public fora.
By contrast, Facebook is used to keep up with friends and family. People join Facebook to see updates about nieces and nephews, post pictures of their vacations and share memes. Despite its more recent evolution into a news platform, Facebook has long had an emphasis on community, friends and family - and people log on for those purposes.
Then, unexpectedly, they get sucked in.
Yes, some of the people now raving about 5G or mandatory vaccination may have ended up doing so without Facebook. But many more would never have been converted to conspiracy theorists in the first place, Dr Catherine Strong, a senior journalism lecturer at Massey University who studies social media and fake news, told Newsroom.
"People who would not have normally gone to seek that type of information are getting it. People who would never protest are protesting against lockdown, saying it’s against their rights, because things on Facebook have given them the emotional feeling that this is all made up anyway and that it’s been overstepped," she said.
"It’s sucking them in. Sucking them in to political things that they don’t even realise are political. This is not necessarily a site you go to if you were trying to analyse politics or if you actually were really into conspiracy theories. It’s laced with lovely dog pictures and the family photos, so it’s almost by stealth, it integrates their world with conspiracies."
This is not an uncommon experience in New Zealand. Hamilton journalist Oskar Howell told Newsroom about an associate of his who had never shown any signs of conspiratorial thinking in the time he knew her.
Then, in the midst of the pandemic, she began posting about 5G.
"She was a critical thinker and always a fan of a good debate, but she never gave off the vibe that she would be into 5G. It came as a total surprise to see her in the group and actively contributing, without doing any research or using common sense," he said.
"5G is concerning because it seems to be sucking in regular people looking to project their uneasiness or anger or whatever it is."
'You can't blame a microbe'
Uneasiness, Auckland University medical sociologist Robert Bartholomew told The Detail, is exactly what it is.
"Often people who believe in the 5G conspiracy theories are lumped in with fringe people who have mental health issues. But it's often just normal, healthy people," he said.
"During times of crises, conspiracy theories flourish because they are incubated in an atmosphere of uncertainty and anxiety. It's the same atmosphere that incubates rumours. And when it's viewed as being a personal threat to one's health, that makes it more pertinent to people."
Strong put forward a similar theory.
"There are a whole lot more vulnerable people, because the pandemic is just too enormous to comprehend. Absolutely too tremendous. And with any trauma, people go through stages of denial. Of hate. And then of attribution. They want to attribute this to someone," she said.
"And if they can attribute it to 5G towers, if they can attribute it to somebody or some thing or even attribute it to the media for overplaying it when 'it’s really not that real, there’s not that many deaths, they’re making it up', then they can make sense of it. You can’t blame a microbe."
A survey from the University of Otago also indicated this. Ana Stojanov, from the university's Department of Psychology, reported the survey found people "who reported lower perceptions of general control over their lives reported that they were more affected by the crisis. These participants also reported higher belief in Covid-19-related conspiracy theories, consistent with the idea that lack of personal control may indeed motivate belief in conspiracy theories."
University of Waikato epistemologist and conspiracy theory expert M Dentith told The Detail that the scientific nature of the crisis is just too difficult for many people to grapple with.
"This is the thing about the Covid-19 pandemic. Most of us don't know much about epidemiology. Most of us don't know much about virology. The claims being made about exactly what's going on are horrendously complex," he said.
"So people are just searching for answers of some kind. And when people put forward a conspiracy theory which provides a slightly simpler explanation of the events at hand, people end up going, 'Well, I don't know whether it's true, but I'm willing to entertain it.'"
Facebook facilitates the spread
That can't be separated from Facebook, Strong believes.
"Facebook tends to collect up emotions – all sorts of emotions – and get them mixed up with current events. So people who are emotionally on the edge with the pandemic, they’re in fear. And then this [conspiracy] feeds into it. That’s what Facebook does well, is it gives people an opportunity to support something they want to be real. And it’s not real," she said.
Howell's friend isn't the only person who's been sucked into conspiracy theories through Facebook. Anecdotes of such events are widespread. In her exposé on the QAnon conspiracy theory for The Atlantic, Adrienne LaFrance describes how one woman fell headfirst into the pro-Trump delusion.
Lorrie Shock, a former factory worker and Q devotee, introduced herself to LaFrance by promising that QAnon supporters are "not a domestic-terror group". She found out about Q through Facebook in 2017 and spent hours a day researching the subject - again, on Facebook.
Shock and a Q-supporting friend "rely on information they encounter on Facebook rather than news outlets run by journalists. They don’t read the local paper or watch any of the major television networks," LaFrance relates.
This is where they came to the majority of their beliefs, such as the notion that Michelle Obama is secretly a man and that John F. Kennedy Jr. faked his death and is now a military intelligence official allied to Trump in his fight against the "Deep State".
It isn't just researchers and journalists who have observed how effective Facebook is at converting regular people into hardcore conspiracy theorists - the conspiracists themselves have noticed too.
In private channels on the encrypted chat app Telegram - a haven for extremists of all kinds - the New Zealand anti-lockdown protesters discussed plans to recruit others.
Damien de Ment, a far-right New Zealand-based YouTuber, asked members of the "ReopenNZ" Telegram channel, "how are You expanding your personal reach? Are you sharing in an echo chamber or are you reaching LEFT and widening the pool of awake Patriots?" [sic].
"It's important that we increase volume of memes in social media sharing and try to infiltrate mainstream NZ hashtags to carpet bomb with targeted memes / topics to maximize 'curiosity,'" he wrote.
In response, another user, who moderates the anti-vaccination Facebook group FACTS NZ, wrote that she "put up a post and boosted it for $10. I plan to do one a day. I wrote about it in the facebook group."
When asked how best to reach out to others, the same user linked a three-part guide to "red pilling" - a term from The Matrix used originally by men's rights activists, and later the rest of the far-right, to describe "waking up" to the way the world really is. In other words, if you want to red pill someone, you want to convince them of your worldview.
Facebook helps maximise the reach of such groups. In planning a May 23 rally in Wellington, one user wrote, "Is anyone doing a facebook event for protest at parliament on Saturday?"
"I will post the details of this Saturday as an announcement on the Facebook group today, like last week. Do we need to make a Facebook event as well do you think?" another responded.
"Yeah I do think that's a good idea as it's easier to invite people," the first wrote back.
Then a third user pitched in. "Here is a Farcebook [sic] trick," they wrote.
"Don't make a page or a fb group, make an actual 'person'. As a PERSON, you can go and join stacks and stacks of Groups! Pick the ones with large numbers. Write one update and post that to say 50 groups. Hit 200 thousand [people] with same info in 5 mins."
Despite having just 16 members in the Telegram chat organising the previous Wellington protest, three times as many people showed up in real life after the event was advertised on Facebook.
Minimisation a gateway
Despite the ease with which Facebook facilitates red pilling, most people don't go from having middle-of-the-road beliefs to becoming convinced that 5G is at the root of the pandemic in just one leap.
The process begins with minimisation, which is why Strong sees efforts to downplay the seriousness of the virus as not just dangerous, but also a form of misinformation.
"You can just see a lot of the postings that are trying to minimise it, saying it’s just like the flu, saying 'Hey, more people die of this, that or the other than the coronavirus'. And the postings that say 'Oh but they’re counting deaths when they really shouldn’t be counted'. They’re getting spread, spread so fast and so wide that people who would not have normally gone to seek that type of information are getting it," she said.
"These ones that are minimising the effect of coronavirus is what I think is the danger part of Facebook at the moment: minimising it. They want to believe 'This I can put into my normal life. I have lived with people with cancer, I have lived with people with the flu, I have lived with people with diabetes, and if this is put in that same pool, I can normalise it.'"
The guide to red pilling shared in the Telegram channels makes a similar point.
"Coax them in the right direction. The last thing you want is to bludgeon them with too much information," it begins, before providing a list of things that shouldn't be brought up initially, including aliens, Satan, chemtrails and crop circles. Then it offers a list of "red pills that are easy to swallow".
"These messages have a higher chance of reaching more people without being dismissed outright," the guide states. It recommends using memes as "most people browse social networks for the funny memes. This tactic works best when you hijack something universally known, or actual/big events."
Minimisation of the virus quickly tips into outright conspiracy theory, such as the false but frequently-echoed notion that all deaths in hospital in the United States are being counted as Covid-19 deaths.
5G conspiracy emblematic
Then, things get a little darker and a lot more bizarre.
The 5G theory is emblematic of conspiracy in the age of Covid-19. It is rooted in a preexisting conspiracy theory - the false idea that 5G is dangerous and that the government, telcos and scientists are all lying about it - but has taken on a life of its own with the arrival of the coronavirus.
NZ Compare, a consumer advice and transparency firm, released a survey in December that found 46 percent of Kiwis were concerned 5G might affect human health and a third were worried about its impact on animals and plants. Such worries are unfounded, according to scientific experts, but the movement opposing 5G is surprisingly widespread, as Newsroom reported in October.
Anti-5G protesters were concerned the technology could kill bees or give humans cancer, but the Prime Minister's chief science advisor has launched a new website to dispel these myths.
"The radio waves used for 5G have frequencies that are ten thousand times too low to damage molecules," the website states.
"Radio waves can heat our body if we are over-exposed to them. However, these effects can only occur when exposed directly to a very powerful source so that the heat builds up enough to damage tissue before it dissipates. 5G sources are simply not powerful enough to cause damage in this way.
"Many researchers have explored possible connections between radio frequency radiation and cancer and as is often the case when there are many separate studies, a small number have reported an association between exposure and cancer, such as mobile phone use and brain tumour risk.
"Significantly more high-quality studies have found no associations, including studies funded by cancer research organisations. The clear conclusion reached internationally, supported by health authorities in New Zealand, is that exposure to this type of radiation at levels experienced in New Zealand is not hazardous."
A quick comparison between the countries that have deployed 5G and those that have suffered the worst from Covid-19 shows there is little correlation. For example, France, which has the seventh most cases and more than 28,000 deaths, has no 5G network. Likewise, the Netherlands ranks 20th in total infections but has no 5G network.
On the other hand, the Philippines, which has had 5G since July, has reported just 13,221 cases of the virus. South Korea, which had one million 5G users in June of last year, also has a quarter of the number of cases the Netherlands has. Even New Zealand, which launched 5G in December, has had only 1503 cases and 21 deaths.
"I can’t state it clearly enough. I almost hesitate to speak to it on this platform - it is just not true," Prime Minister Jacinda Ardern said of the conspiracy theory in April.
Yet, it has spread like no other. An anti-5G Facebook group in New Zealand that had about 5000 members prior to the virus outbreak has now more than doubled in size, to 13,000 members, at the same time as critical mobile network infrastructure has been targeted by would-be saboteurs.
"I was told today that Jacinda is turning on the 5G radiation towers as soon as June," one user wrote in the official ReopenNZ Facebook group, seemingly not aware, despite their passion for the conspiracy theory, that 5G has been active in New Zealand since December.
Vaccination, QAnon also popular
Almost as popular as the 5G theory are ones surrounding vaccination. Yet again, a preexisting false idea - that vaccines are dangerous and that someone is covering it up - slots perfectly into the pandemic world, where news reports breathlessly relate the latest updates in the quest for a global vaccine.
"There is already a number of vaccines waiting in the wings...have been for some time now....they're just making a pretense at inventing" one user incorrectly wrote in the ReopenNZ Facebook group, which has collected more than 1500 members since May 8.
The idea that any vaccine will be mandatory - which has not been suggested by any decision-maker in New Zealand - is a major fear for anti-vaccine advocates.
"The new bill is for when they put us back in level 3 or 4 for mandatory door to door testing and vaccination," a ReopenNZ poster falsely stated during a discussion on the Level 2 restrictions legislation.
These intimations of a global plot to imprison and vaccinate the population also feed into the one conspiracy you might not expect to find in New Zealand but which is in fact surprisingly widespread here: QAnon.
This theory - or perhaps it is better described as a quasi-religious belief system - posits that a pedophile cabal of global elites, including Hillary Clinton, Bill Gates and, for Kiwi devotees, Jacinda Ardern, operate a worldwide "Deep State" against the will of most national populations. In the QAnon mythology, a lone military intelligence officer with Q clearance - a term from the United States Department of Energy - posts cryptic hints on anonymous message boards about the coming mass arrest of the cabal, on the orders of Donald Trump.
As insane as it sounds, QAnon is enormously popular in the United States, with some prominent YouTube videos on the subject racking up more than a million views. In New Zealand, too, it punctuates the conspiratorial rantings of the Reopen protesters, both on Facebook and Telegram.
A Facebook ReopenNZ group member linked others to a website that notifies users whenever "Q" makes a new post. On Telegram, someone wrote they were "just waiting for Q to take over the tv network".
Real world consequences
These theories may sound ridiculous, but the easy dissemination of them through Facebook has real world consequences.
In the United States, nearly every Reopen protest has been coordinated through Facebook, often in violation of local stay-at-home orders.
More reports of the spread of the virus at these rallies are likely to emerge in the coming weeks as Covid-19 continues to saturate the country - nearly one in every 200 Americans has now contracted the virus and three-quarters of them have yet to recover.
New Zealand, no more immune to the misuse of Facebook or the spread of conspiracy theories, has also seen real-life impacts.
The explosion of the 5G conspiracy in New Zealand has led to attempts to sabotage or burn down more than a dozen cell towers. None of the targeted sites have been 5G-related infrastructure, but telcos are warning that if the attempts continue, Auckland could lose internet and mobile reception. Even calls to 111 might be blocked if enough damage is done.
This is not a prank and it isn't minor vandalism either. It's an active, politically-motivated attempt to target and destroy critical infrastructure that could endanger human life - the definition of a terrorist act under the Terrorism Suppression Act 2002.
Nonetheless, posts seeking instruction on how to sabotage telecommunications infrastructure flourish on Facebook. In one post to the anti-vaccination FACTS NZ group, a user posted a picture of a cell tower asking if it was 5G-related. In the comments, others urged her to burn it down or blow it up. The comments remain up more than six weeks later.
Although it is not known if non-compliance with the lockdown led to any infections in New Zealand, there were numerous cases of such breaches, leading to thousands of prosecutions. In the Facebook comments on the daily livestreamed press conferences throughout the lockdown, people regularly questioned whether the virus was real and encouraged others to violate the Level 4 rules.
At least one individual charged with breaching the rules is known to have been influenced by fake news on Facebook. In the days before he filmed himself coughing on strangers in a Christchurch supermarket - an act that earned him the label of "idiot" from the Prime Minister herself - Raymond Coombs posted several fake articles and memes about the coronavirus on his Facebook page.
Overseas, the University of Oxford has found that belief in conspiracy theories reduces compliance with public health measures, including social distancing, staying home, not visiting friends and family during lockdown, willingness to be tested for Covid-19 and wearing a face mask.
"Conspiracy beliefs are likely to be both indexes and drivers of societal corrosion. They matter in this context because they may well have reduced compliance with government social distancing guidelines, thereby contributing to the spread of the disease. One consequence of this national crisis may be to reveal fully the harmful effects of mistrust and misinformation," the study found.
"Higher levels of coronavirus conspiracy thinking were associated with less adherence to all government guidelines and less willingness to take diagnostic or antibody tests or to be vaccinated. Such ideas were also associated with paranoia, general vaccination conspiracy beliefs, climate change conspiracy belief, a conspiracy mentality, and distrust in institutions and professions."
As the country moved to Level 2, many members of the various conspiracy Facebook groups discussed ways to avoid contact tracing at hospitality outlets, including the provision of fake names. Director-General of Health Ashley Bloomfield said such actions put the health of others at risk.
"I just encourage people to be honest, participate," he said.
The role of Facebook in amplifying misinformation and incitement to non-compliance has been noticed by the Government as well. On Wednesday, Jacinda Ardern told Newsroom that while she hadn't specifically raised the issue with Facebook, it was something cyber policy experts were examining.
"I was just talking to the team this morning about how important I still see the work around algorithms and transparency around algorithms, because there is certain misinformation that leads people into some pretty dark places," she said.
"There's been a connection through this period of various conspiracy theories combining into uber-conspiracy theories that are really unsettling for people. There is a lot more work to be done, and so if there's anywhere that I continue to maintain a real interest in the ongoing work of the [Global Internet Forum to Counter Terrorism], it will be things like that algorithm work."
Ardern also said the Christchurch Call's focus on terrorism and violent extremism wasn't meant to limit the conversation around other forms of misinformation.
"Our 'in' to this conversation was an area where you just can't disagree. No one is going to stand here and argue for violent extremist content and terrorist content online. And I would hope we get to a place where we will also equally have a view that misinformation, dangerous misinformation, actually we all have a role to play in that as well."
Facebook isn't doing enough
Facebook has made a big deal of the action it has taken to fight misinformation about the virus.
Governments around the world have partnered with the platform to disseminate relevant local information about public health responses and requirements. Facebook is also working with the World Health Organisation to provide better access to true information.
After a damning report in early April found 29 percent of English-language coronavirus misinformation was not labelled as false and that Facebook took up to 22 days to identify false information, the company rolled out a system that directed every user who had seen false information to WHO sources. That system has now flagged the WHO site to more than two billion users who were exposed to misinformation, Facebook says, about one sixth of whom have clicked through.
"We’re aggressively going after misinformation about Covid-19 and have teams across the company dedicated to this effort. In April alone, we applied strong warning labels to more than 50 million pieces of misinformation and removed hundreds of thousands of pieces of misinformation that could lead to imminent physical harm. We’ve also directed over two billion people to resources from the World Health Organisation through pop-ups and our dedicated Covid-19 Information Centre," a Facebook spokesperson said.
Near the start of the outbreak, the platform made a commitment to remove any "Covid-19 related misinformation that could contribute to imminent physical harm". This is largely limited to false claims about cures, treatments, the availability of essential services and the location or severity of outbreaks, Facebook says.
It does not seem to have extended to the conspiracy theories noted above, although the company insists that discussions of attacking 5G infrastructure and false claims 5G technology causes Covid-19 are subject to removal. In New Zealand, at least, this has not generally occurred.
"Once a piece of content is rated false by fact-checkers, we reduce its distribution and show warning labels with more context. Based on one fact check, we’re able to kick off similarity detection methods that identify duplicates of debunked stories," a Facebook media release stated.
"For example, during the month of March, we displayed warnings on about 40 million posts related to Covid-19 on Facebook, based on around 4000 articles by our independent fact-checking partners. When people saw those warning labels, 95 percent of the time they did not go on to view the original content."
However, such warnings are rarely effective in dissuading the conspiracy crowd. One recent post in the ReopenNZ group received a "partly false" rating, which only further convinced the commenters of its veracity.
The warnings also only apply to links and videos. Text posts, which quickly go viral through copy-pasting to dozens of likeminded groups, are not subject to such warnings. The warnings also depend on human fact-checkers, and only one fact-checking service is currently operating in New Zealand. While such an approach ensures quality of moderation, it is rarely quick enough to tackle misinformation at the speed it spreads.
The debut of the 'Plandemic' conspiracy video put Facebook's claims about dealing with misinformation to the test - and it utterly failed.
It took the platform some 48 hours to pull down the faux-documentary, which posits in QAnon-style that a cabal of global elites planned the pandemic to enforce mandatory vaccination on the world populace. During the time it was up, the video spread from Reopen group to Reopen group, was reposted by quack doctors with verified pages and went viral on QAnon channels, all the while racking up millions of views, according to a New York Times analysis.
Four months into a global pandemic, Facebook was unable to stop bad actors from weaponising its platform to spread misinformation to millions of people in the form of a simple 26-minute video.
For years, Facebook has promised to take responsibility for how its platform is being misused. For years, the company has pledged to clean up its act.
Surely the fifth most visited site on the internet, a company worth US$617 billion, will take action to stop the virulent spread of misinformation that is leading to attempted acts of terrorism in New Zealand and the furtherance of a global pandemic?
It's not too late for Facebook to change tack. As it stands, the platform still allows the dissemination of misinformation, as long as it doesn't "contribute to imminent physical harm". Changing that would be a good first step - instead of labelling nearly 90 million posts false over the past two months, just take them down.
Private groups are also a problem. Without the eyes of the public, able to report false claims to the site's human and algorithmic moderators, misinformation spreads like wildfire in places like ReopenNZ or FACTS NZ. Cracking down on private groups through the deployment of more robust fact-checking measures - or just preventing users from creating private groups about the virus - could help stymie the fake news time bomb ticking away behind closed groups.
There is much more the company could do. So far, this has been Facebook's Sandy Hook, but it's not too late for the company to turn things around.