Helen Clark Foundation report: Harmful Content on Social Networks

Helen Clark backs Jacinda Ardern’s Christchurch call: ‘All key players should be there’

Former prime minister Helen Clark says those who aren’t attending the “incredibly important” Christchurch call meeting in Paris are saying more about themselves than about the summit itself.

Speaking to Stuff ahead of releasing a report on reducing social media harm from her new think tank, Clark said the call was a “huge deal” and “all the key players should be there”.

“I think this says more about the people who are not going than the call itself. It’s an incredibly important call, and why would those people not be there? That’s what will get the interest,” Clark said.

She said getting an issue like this on the table at a G7 meeting was “unprecedented” for New Zealand, and praised Ardern for maintaining momentum.

“I think that New Zealand is going to be defined not by the horrific attack itself, but by the way she has responded. New Zealand is making a significant statement about who it is and what needs to be done locally and globally.”

The Helen Clark Foundation report’s key recommendation:

We recommend a legislative response to address the spread of terrorist and harmful content online. This is because there is ultimately a profit motive for social media companies to spread ‘high engagement’ content even when it is offensive, and a long-standing laissez-faire culture inside the companies concerned which is resistant to regulation.


Harmful Content on Social Networks

Executive Summary

Anti-social media: reducing the spread of harmful content on social media networks

  • In the wake of the March 2019 Christchurch terrorist attack, which was livestreamed in an explicit attempt to foster support for white supremacist beliefs, it is clear that there is a problem with regard to regulating and moderating abhorrent content on social media. Both governments and social media companies could do more.
  • Our paper discusses what can be done to address this in a New Zealand context, touching on: what content contributes to terrorist attacks, the legal status of that content, the moderation or policing of communities that give rise to it, the technical capacities of companies and police to identify and prevent the spread of that content, and where the responsibilities for all of this fall – with government, police, social media companies, and individuals.
  • We recommend that the New Zealand Law Commission carry out a review of laws governing social media in New Zealand. To date, this issue has been addressed in a piecemeal fashion by an array of government agencies, including the Privacy Commission, the Ministry of Justice, the Department of Internal Affairs, and Netsafe.
  • Our initial analysis (which does not claim to be exhaustive) argues that while New Zealand has several laws in place to protect against the online distribution of harmful and objectionable content, there are significant gaps. These relate both to the regulation of social media companies and their legal obligations to reduce harm on their platforms, and to the extent to which New Zealand law protects against hate speech based on religious beliefs and hate-motivated crimes.
  • The establishment of the Royal Commission into the attack on the Christchurch Mosques on 15 March 2019 (the Royal Commission) will cover the use of social media by the attacker. However, the Government has directed the Royal Commission not to inquire into, determine, or report in an interim or final way on issues related to social media platforms, as per the terms of reference. As a result, we believe that this issue – of social media platforms – remains outstanding, and in need of a coordinated response. Our paper is an initial attempt to scope out what this work could cover.
  • In the meantime, we recommend that the Government meet with social media companies operating in New Zealand to agree on an interim Code of Conduct, which outlines key commitments from social media companies on what actions they will take now to ensure the spread of terrorist and other harmful content is caught quickly and its further dissemination is cut short in the future. Limiting access to the livestream feature is one consideration, if harmful content can genuinely not be detected.
  • We support the New Zealand Government’s championing of the issue of social media governance at the global level, and support the ‘Christchurch Call’ pledge to provide a clear and consistent framework to address the spread of terrorist and extremist content online.

Helen Clark was interviewed about this on Q&A last night.