Tech sector layoffs reduce content moderation and put human rights at risk

In 2022, the tech sector laid off 100,000 workers and significantly downsized its human rights and content moderation teams. This is likely to increase harm from social media platforms, which have been blamed for fueling hate and social division through algorithms that prioritize emotive content to drive engagement.

While social media platforms are not the original source of political or societal division, they tend to amplify extremist content, which can enable mass harassment and manipulation. According to Lisa Schirch of the Kroc Institute at the University of Notre Dame, content moderation is not reaching enough people to prevent widespread disinformation. This is especially true in the Global South, where companies lack sufficient staff who speak local languages and therefore fail to remove a significant amount of harmful content.

Social media algorithms not only incentivize content creators to produce polarizing content but also discourage engagement from a wider range of voices in society. Ravi Iyer of the University of Southern California's Neely Center argues that negative content deters minority groups from participating in important social media discussions. To counter these harmful algorithms, Iyer proposes several solutions. Platforms should not rely solely on content moderation, which shades into censorship; instead, they should improve their design so that they evolve from a destructive into a constructive setting. His recommendations include removing optimization for comments and shares in conflict-sensitive contexts, limiting the reach of new and untrusted users through rate and distribution limits, providing accessible privacy controls, and supporting on-platform efforts by conflict transformation professionals.

Furthermore, governments could consider taxing tech companies whose platforms cause social harm. Conversely, Schirch notes that companies could receive tax breaks and other financial incentives if they improve their platforms' design and governance.

Many experts, such as Simin Fahandej of the Baha’i International Community’s United Nations Office in Geneva and Christian Cirhigiri of Search for Common Ground, question whether regulations are even enough. In international conflicts, the Universal Declaration of Human Rights is often ignored. Societies must therefore take a greater stand in supporting these norms, for example through large-scale social movements.

These findings were discussed in the session “Reimagining Technology: The Role of Social Media Algorithms in Promoting Social Cohesion” at the 10th PeaceCon conference, held in Washington, DC, May 3-5, 2023.

Panelists:

Simin Fahandej, Baha’i International Community’s United Nations Office in Geneva 

Ravi Iyer, University of Southern California Neely Center

Christian Cirhigiri, Search for Common Ground

Lisa Schirch, Kroc Institute at the University of Notre Dame

Tia Savarese

Tia Savarese is a recent graduate of the George Washington University, where she earned a Bachelor of Arts in International Affairs with concentrations in Conflict Resolution and Security Policy and a minor in Spanish. She serves as Commissioning Editor and Social Media Manager for Peace News Network and resides in Washington, DC. Before joining PNN, she interned for a communications firm, a think tank, and the federal government.