Sunday, 16 May 2021

Israel-Palestine Conflict Highlights Why Diversity at Twitter and Facebook is Critical



Irrespective of your view on the Israel-Palestine conflict currently raging, it raises important issues and lessons about diversity on social media platforms like Facebook and Twitter that go beyond the conflict itself.

Social media platforms often seem like neutral spaces that simply host uploaded content. However, they, and the people they employ, are constantly making editorial decisions. These include, for instance, which hashtags to promote or block, which content to promote or block, which users to promote or block, and, in a broader context, which entire issues to promote or block.

The blocking of hashtags recently made international headlines when posts on Instagram containing the hashtag #AlAqsa, or its Arabic counterparts #الاقصى or #الأقصى, were either taken down or hidden from search results. Facebook, which owns Instagram, said this was a mistake caused by the Al-Aqsa name being associated with terrorist organizations.

A few days earlier, Twitter blamed technical errors for deleting posts and suspending accounts that mentioned the possible eviction of Palestinians from the East Jerusalem neighbourhood of Sheikh Jarrah.

And, in what may at first seem a completely unrelated issue, last week Facebook announced that it would remove groups and pages that discourage people from getting vaccinated against Covid-19, regardless of whether or not the information they share is true.

I highlight these three examples because, irrespective of your political views, they illustrate that social media platforms are making subjective editorial decisions, not simply objective ones based on the factual accuracy of posts. These decisions are made either directly by human moderators or indirectly by algorithms that implement choices made by human programmers. Such subjective moderation usually draws public attention only when "mistakes" are made or the decisions are particularly contentious, but the reality is that they are being made every day.
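To make that last point concrete, here is a deliberately simplified, hypothetical sketch (in Python, and not based on any platform's actual code) of what an automated hashtag filter might look like. The names used, such as BLOCKED_HASHTAGS and should_hide, are my own illustrative inventions. The point is that the "algorithm" has no opinions of its own: the blocklist it consults is written by people, and whoever writes it is making an editorial decision.

```python
# Hypothetical sketch of an automated moderation rule.
# Every value here -- which hashtags are on the blocklist, how matches are
# made -- is a human choice encoded by a programmer. That is where the
# subjectivity enters, even when no human moderator ever sees the post.

BLOCKED_HASHTAGS = {"#examplehashtag"}  # chosen by people, not by the algorithm


def should_hide(post_text: str) -> bool:
    """Return True if the post contains any blocked hashtag."""
    words = post_text.lower().split()
    return any(word in BLOCKED_HASHTAGS for word in words)


if __name__ == "__main__":
    print(should_hide("Reporting live from the square #ExampleHashtag"))  # True
    print(should_hide("An unrelated post about the weather"))             # False
```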

Once this is recognised, it becomes important, from a diversity and media representation perspective, to know the demographics (race, gender, disability, socio-economic background, etc.) of the people making these decisions.

To put it at its most basic: if editorial decisions are being made about which content or terms to block with regard to the Israel-Palestine conflict, it would be useful to know the diversity of the people either making those decisions directly or programming the algorithms that make them.

And while this is true for coverage of Israel-Palestine, the argument transfers to almost any news story, from Black Lives Matter to Brexit.

Unfortunately, the social media platforms publish only limited diversity information about their workforces. And even the limited information we do have is not encouraging.

At Facebook, for example, 89% of US leadership and 91% of US technical staff are either White or Asian.

Conversely, Black people, who make up 13.4% of the population of the US, where Facebook is headquartered, hold only 3.4% of leadership roles and 1.7% of technical roles.

Looking at the Israel-Palestine conflict specifically, “Additional [Ethnic] Groups” - the category under which Arab-Americans would fall - make up 0.3% of leadership roles and 0.2% of technical roles. (There are no separate statistics for Jewish-Americans.)

Women are similarly underrepresented globally in leadership and technical roles at Facebook, at 34.2% and 24.1% respectively.

Twitter is marginally better than Facebook on both gender and Black representation, but both groups are still massively underrepresented in technical and leadership roles. And it does not publish statistics for its Arab or Jewish workforce in the US.

Both social media platforms recognise they have to do better when it comes to diversity and have set out broad diversity targets.

Importantly, as far as my research goes, I can find no breakdown of the ethnicity or gender of the people making editorial decisions at either company - or at any social media platform, for that matter. But assuming the diversity of those editorial teams is at least partially in line with the diversity of the companies’ leadership and technical roles, we are left with the impression that important editorial decisions about content, users and globally sensitive stories are being made by an extremely unrepresentative group of people.

I have written extensively about how this type of situation adversely affects the editorial decision-making of newsrooms and conventional media organisations, and all of those arguments hold true for newer social media platforms.

It is now more important than ever that the diversity of the teams making editorial decisions on social media platforms is revealed. And just as Facebook and Twitter have set broad diversity targets in categories such as “leadership” and “technical”, they now need to set targets for their editorial decision-making teams.

As one Facebook employee wrote in a memo, seen by the New York Times, about the wrongful blocking of posts about the Al-Aqsa Mosque: “These mistakes are painful, erode the trust of our community and there is no easy fix for that”. One of the best places to start rebuilding that trust would be for the platforms’ diversity to reflect the communities they are trying to serve.
