Facebook: a tiny step in reining in a Goliath of global power

9:17 pm on 15 May 2019

By Jess Berentson-Shaw and Marianne Elliott*

Opinion - With Facebook's announcement today that they will ban users from their live streaming service if they "violate our most serious policies", we are seeing the corporation start to take responsibility for cleaning up the mess they have made of our online spaces, and of democracy with them.

Photo: 123RF

This move speaks to the power of citizens, and their elected representatives, to influence the policies and practices of hugely powerful global institutions. We should celebrate this first small step as a win for democratic action.

However, we should be clear that Facebook is only doing the minimum we demand of other businesses: taking responsibility for the harm their product causes the people who use it. Just as we require those responsible for oil spills and pollutants to clean up the damage done to our environment, so Facebook must clean up the mess it has made of our democracy - the spread of hatred and extremism, and the resulting inability of people to participate in all parts of our society (including the internet) free from hateful speech and actions.

Hateful speech on the internet, extending into offline spaces, is shutting people out of our society and out of public conversation, telling them their experiences and ideas do not matter, and making them feel unsafe and unwelcome. And if people are shut out of the conversation because they are targeted by hatred, the conversation is not a free one. Rather, it is a huge constraint on many people's freedom.

In our research on the harm digital media does to our democracy, we found that social media as it currently operates does not generally act to open up public conversations, or to include people who would otherwise be excluded.

Rather, where platforms allow hatred towards minority groups, those groups are often shut out. A spiral of silence occurs: people already less inclined to join public conversations withdraw further because of the harmful narratives rife on social media.

More generally, people's trust in democracy is being undermined by the manipulation of information on social media. Such manipulation is not a new social phenomenon, but digital media has given those who spread it a uniquely supportive environment.

Yet, it need not be this way. The opportunities that social media, and digital media more widely, present to develop a more inclusive democracy are still there. In the survey of New Zealanders that informed our research, we also found that people from minority ethnic groups use social media more than other groups specifically to connect with politicians and engage in political issues. Digital media does facilitate inclusion under the right conditions.

To build on these opportunities, and expand the conditions for inclusion, we need Mark Zuckerberg, and the CEOs of other social media platforms, to take responsibility for preventing hate from being spread across the internet in the first place, just as we need to address the practices of the fossil fuel industry that lead to tankers of oil being driven onto coral reefs. Prevention is better than cure.

Facebook can't be relied upon to self-regulate

While Facebook has voluntarily taken this first step, we cannot rely on this Goliath of global power, which operates as a near-monopoly, to regulate itself to prevent harm - not when so much profit is involved.

Our governments must recognise the threat to democracy that is posed by the huge power a handful of privately-owned platforms wield over so many aspects of our lives. They need to act to regulate these corporations, as well as explore and test a number of other actions to build an environment which ensures corporations are properly responsible for protecting the people who use their services.

The algorithms these businesses deploy use our personal data in unprecedented and unbalanced ways to drive content to people, and there is very little transparency about how they work or accountability for their impact, especially their role in radicalising people towards hateful ideas and actions.

What this first step from Facebook shows us is that we do have the power to effect real change in these very modern institutions that determine so many aspects of our lives. And more power to us.

*Dr Jess Berentson-Shaw is a researcher and public policy analyst. Marianne Elliott is a researcher and human rights lawyer. Together they founded The Workshop, a research and policy collaborative.
