Social media played a part in Christchurch mosque massacres

Facebook was used to live stream one of the mosque massacres in Christchurch yesterday, but social media also played a wider role in promoting and abetting the warped mentality of the cowards who killed so many defenceless and innocent people.

Margaret Sullivan, The Washington Post’s media columnist (via NZ Herald):  Social media platforms were used like lethal weapons in New Zealand. This must change now

Right from the twisted start, those who plotted to kill worshipers at two New Zealand mosques depended on the passive incompetence of Facebook, YouTube and other social media platforms.

They depended on the longtime priorities of the tech giants who, for years, have concentrated on maximising revenue, not protecting safety or decency.

They got it.

While Facebook, YouTube et al have a very difficult job identifying uses and abuses of their platforms by terrorists, especially in advance of terrorist acts, they have been far too lax in the past. The killings in Christchurch will put more pressure on them to change how they do things.

My colleague, Washington Post tech reporter Drew Harwell, summed up the social media disaster succinctly in a tweet: “The New Zealand massacre was live-streamed on Facebook, announced on 8chan, reposted on YouTube, commented about on Reddit, and mirrored around the world before the tech companies could even react.”

It gets worse. The brutality that killed at least 49 people and wounded many others was fueled and fomented on social media — inviting support and, no doubt, inspiring future copycats.

One of the suspects had posted a 74-page manifesto railing against Muslims and immigrants, making it clear that he was following the example of those like Dylann Roof, who in 2015 murdered nine black churchgoers in Charleston, South Carolina.

All of it ricocheted around the globe, just as planned.

The platforms, when challenged on their role in viral violence, tend to say that there is no way they can control the millions of videos, documents and statements being uploaded or posted every hour around the world. They respond when they can, which is often with agonising slowness and far too late.

To the extent that the companies do control content, they depend on low-paid moderators or on faulty algorithms. Meanwhile, they put tremendous resources and ingenuity — including the increasing use of artificial intelligence — into their efforts to maximize clicks and advertising revenue.

This is far from the first time acts of violence have been posted in real time. Since Facebook’s live-video tool began in 2015, it’s been used to simulcast murder, child abuse and every sort of degradation.

But the tragedy in New Zealand takes this dangerous — and largely untended — situation to a new level that demands intense scrutiny and reform.

Reddit, for one, often takes the view that its users deserve to be treated like grown-ups, to see what they want to see.

As its representatives on Friday closed down a thread called “watchpeopledie,” where users commented on the massacre video, they sounded regretful:

“The video is being scrubbed from major social-media platforms, but hopefully Reddit believes in letting you decide for yourself whether or not you want to see unfiltered reality,” the post said. “Regardless of what you believe, this is an objective look into a terrible incident like this.”

Being scrubbed after the event has obvious problems – the information has already been widely circulated by then.

Where are the lines between censorship and responsibility?

These are issues that major news companies have been dealing with for their entire existences — what photos and videos to publish, what profanity to include.

Editorial judgment, often flawed, is not only possible. It’s necessary.

The scale and speed of the digital world obviously complicates that immensely. But saying, in essence, “we can’t help it” and “that’s not our job” are not acceptable answers.

Friday’s massacre should force the major platforms — which are really media companies, though they don’t want to admit it — to get serious.

This also matters for minor platforms, like Your NZ. I have always tried to discourage, and where necessary remove, the worst attempts at spreading and fomenting hate and intolerance, but constantly monitoring things and getting the balance right between free speech and acceptable speech is a difficult job.

Kevin Roose (NY Times):  A Mass Shooting of, and for, the Internet

The details that have emerged about the Christchurch shooting — at least 49 were killed at two mosques — are horrifying. But a surprising thing about it is how unmistakably online the violence was, and how aware the suspected gunman appears to have been about how his act would be viewed and interpreted by distinct internet subcultures.

In some ways, it felt like a first — an internet-native mass shooting, conceived and produced entirely within the irony-soaked discourse of modern extremism.

The suspected gunman, who has been charged by the New Zealand authorities, teased his act on Twitter, announced it on the online message board 8chan and broadcast it live on Facebook. The footage was then replayed endlessly on YouTube, Twitter and Reddit, as the platforms scrambled to take down the clips nearly as fast as new copies popped up to replace them. In a statement on Twitter, Facebook said it had “quickly removed both the shooter’s Facebook and Instagram accounts and the video,” and was taking down instances of praise or support for the shooting. YouTube said it was “working vigilantly to remove any violent footage” of the attack. Reddit said in a statement that it was taking down “content containing links to the video stream or manifesto.”

Even the language the suspect used to describe his attack before the fact framed it as an act of internet activism. In his post on 8chan, he referred to the shooting as a “real life effort post.” He titled an image “screw your optics,” a reference to a line posted by the man accused in the Pittsburgh synagogue shooting that later became a kind of catchphrase among neo-Nazis. And his manifesto — a wordy mixture of white nationalist boilerplate, fascist declarations and references to obscure internet jokes — seems to have been written from the bottom of an algorithmic rabbit hole.

It would be unfair to blame the internet for this. Motives are complex, lives are complicated, and we don’t yet know all the details about the shooting. Anti-Muslim violence is not an online phenomenon, and white nationalist hatred long predates 4chan and Reddit.

But we do know that the design of internet platforms can create and reinforce extremist beliefs. Their recommendation algorithms often steer users toward edgier content, a loop that results in more time spent on the app, and more advertising revenue for the company. Their hate speech policies are weakly enforced. And their practices for removing graphic videos — like the ones that circulated on social media for hours after the Christchurch shooting, despite the companies’ attempts to remove them — are inconsistent at best.

We also know that many recent acts of offline violence bear the internet’s imprint.

People used to conceive of “online extremism” as distinct from the extremism that took form in the physical world.

Now, online extremism is just regular extremism on steroids. There is no offline equivalent of the experience of being algorithmically nudged toward a more strident version of your existing beliefs, or having an invisible hand steer you from gaming videos to neo-Nazism. The internet is now the place where the seeds of extremism are planted and watered, where platform incentives guide creators toward the ideological poles, and where people with hateful and violent beliefs can find and feed off one another.

So the pattern continues. People become fluent in the culture of online extremism, they make and consume edgy memes, they cluster and harden. And once in a while, one of them erupts.

In the coming days, we should attempt to find meaning in the lives of the victims of the Christchurch attack, and not glorify the attention-grabbing tactics of the alleged gunman. We should also address the specific horror of anti-Muslim violence.

At the same time, we need to understand and address the poisonous pipeline of extremism that has emerged over the past several years, whose ultimate effects are impossible to quantify but clearly far too big to ignore.



  1. patu

     /  March 16, 2019

    Social media may also play a big part in stopping any further such acts. According to others on Facebook, who looked at the main offender’s timeline prior to it mysteriously vanishing around 5:30pm last night, he posted pictures of his arsenal in February. I’d be very surprised if everyone on his friends list didn’t get a visit from their local police sometime this weekend…

  2. David

     /  March 16, 2019

    They are private companies; they should be able to do what they want. Who becomes the world’s hall monitor, deciding what we are allowed to see? Should everything we see have the filter of a liberally educated news editor?
    We shouldn’t be shielded from the horror; we should recoil from it and condemn it. If you don’t want to see it, don’t look at it.

    • “They are private companies they should be able to do what they want.”

      Only in a perfect world, which we obviously don’t have.

      When media companies actively help foment acts of horror, they need to be pressured or regulated into doing more to minimise the risks.

    • Patzcuaro

       /  March 16, 2019

      Private companies act to maximize revenue and minimize costs; self-regulation is a cost which is minimized in the pursuit of profit. You only have to look at Pike River to see the result. If the tech companies spent as much money on rooting out the underbelly of the internet as they do generating revenue, the internet would be a better place.

    • Missy

       /  March 16, 2019

      Media companies are (on the whole) private companies, yet they are subject to legislation around what they can and cannot put on their sites, why are social media companies different? Why shouldn’t they be legislated about their content?

    • Kitty Catkin

       /  March 16, 2019

      David, if someone records a rape or murder, do we have the right to see it ?

      Do we have the right to see animal cruelty ?

      Or to see someone jumping from a building or being executed ?

    • Kitty Catkin

       /  March 16, 2019

      It’s not just social media, it’s the sites that tell outright lies about Muslims, some of which have been posted here; the supposed pub-wreckers was an obvious example. The emotive language used ‘Muslims demand women-only swimming pools/prayer rooms/halal meat’, and the racist names used of them which I won’t dignify by repeating. We all know who the person is who does this here. Before that, someone used to quote ‘Bare, naked Islam’, which is lies.

      We were told that Muslims burned down a church, the oldest in Germany…it wasn’t, there was not the vast crowd of Muslims who supposedly did it (I forget the number, it was something like 12,000) near it and the blaze was caused by fireworks.

    • Alan Wilkinson

       /  March 16, 2019

      That’s a pretty stupid comment from a professor unless it’s grossly out of context. Has he really no knowledge of what’s been happening around the world?

      • Duker

         /  March 16, 2019

        He said ‘we’.
        He wasn’t talking about the rest of the world. Neither are you, really, but you do avoid admitting it’s your fellow travellers – righties – who do these things.

        • Alan Wilkinson

           /  March 16, 2019

          B.s. He’s criticising the security services for focusing on Islamic extremism which is blatantly ridiculous considering its threats. He may possibly have been right had he just criticised them for not giving sufficient attention to Muslim-haters but he didn’t. He also seems entirely premature in making claims without any foundation when we don’t know if they missed opportunities to detect and stop this or not.

          If reported fairly I think he’s both biased and foolish. Your parting slur reflects on you too.

          • Gezza

             /  March 16, 2019

            Yes and no. Interviewed on AlJazeera he is saying precisely that our security services were looking in the wrong places, for Islamic terrorists, when they should have been looking out for white supremacists. There’s no mistaking what he’s saying there. He’s in their video clips saying it.

            • Alan Wilkinson

               /  March 16, 2019

              Ok, then I take back my absolution for his stupidity. Not really a surprise for Waikato academics though – mostly wetter than their river.

    • The Consultant

       /  March 16, 2019

      I’d bet that Mr Gillespie will love looking under rocks to find his political and ideological opponents.

      • Kitty Catkin

         /  March 16, 2019

        But Trump says that white supremacists are not a real problem, apart from a few with mental health problems….

  3. Bill Brown

     /  March 16, 2019

    NZ has taken a turn that was most likely coming but never wanted.

    It’s one of the saddest days in our history, if not the saddest.

    We will never be the same again

  4. Trump deletes Christchurch mosque massacre tweet

    US President Donald Trump has now deleted his first tweet in response to the Christchurch mosque massacres, which was posted around 8.15pm last night NZT.

    The tweet featured no words of condolence – or any words – just a link to a story on the tragedy posted on alt-right site Breitbart News.

    The article was actually a very straightforward account, by Breitbart standards, which drew on material from the Associated Press. However, some of its comments were deeply offensive, flippant and nasty.

    As a regular Breitbart reader, the President would have known the likely tenor of comments.

    Ten hours later, Trump finally offered a more human response, tweeting a condemnation of the massacre and standing in support of NZ.

    That’s part of a pattern of the US president taking a pause before speaking out against racist violence and other racist incidents.

    And when asked if he thought the Christchurch massacres reflected a rising global threat from white nationalism, he responded, “I don’t, really. I think it’s a small group of people with serious problems.” It’s hard to imagine he would have taken such a mild, mental health line if the shooter had been Muslim.

    • Duker

       /  March 16, 2019

      Yes, narcissism isn’t really such a small group of people – Trump is a prime example.

      • Kitty Catkin

         /  March 16, 2019

        A dear friend was trapped into marrying one. She made his life a misery (to put it mildly), isolating him from friends as people gave up on seeing him after being made to feel like unwelcome intruders. I don’t blame him for not standing up to her; nobody could.

  5. High Flying Duck

     /  March 16, 2019

    The “manifesto” is very disturbing and full of misdirection.

    Click through to Bellingcat to see his take. The 8Chan comments are very unsettling.

    • Kitty Catkin

       /  March 16, 2019

      Someone in the US has suggested that the murderer is actually a leftie agent provocateur who made this elaborate plot, manifesto and all, to discredit the right… I don’t think so. He could have been shot, and he will be behind bars for life. Nobody is that willing to discredit others.