The commodification of online influence: Addressing disinformation from cyber-troops during elections in South Africa
ABSTRACT | There is a critical need to confront digital disinformation and protect democratic processes in South Africa, especially in anticipation of the forthcoming general elections in 2024. The landscape of online influence is multifaceted but has recently been disrupted by the emergence of “cyber-troops” and digital influencers who wield significant power in shaping public opinion during election periods. Legal measures aimed at curbing the spread of false information are likely to be of varying efficacy. As a result, several recommendations and tools for identifying various forms of online disinformation can assist in combating this modern threat to democratic participation.
Citation: P Sekati, The Commodification of Online Influence: Addressing disinformation from cyber-troops during elections in South Africa, ALT Advisory Insights 2023 (5) (12 December 2023).
+-*#*-+
INTRODUCTION
South Africa has witnessed several instances of politically weaponised disinformation in recent years that leveraged computational propaganda and the strategic use of digital influencers. For example, Sifiso Gwala, who is alleged to have operated the online personas @uLerato_Pillay and @MrHandsome_ZA, capitalised on “follow trains” — groups of people who coordinate to follow each other, often around specific hashtags such as #PutSouthAfricansFirst, in order to grow their social media followings — to create echo chambers and amplify xenophobic sentiments. Political parties such as the African Transformation Movement (ATM), Patriotic Alliance, ActionSA, and others have also been documented using follow trains through hashtag campaigns to garner widespread attention and amass voter support. These trends impact South Africa’s democratic processes by communicating sentiments that exacerbate societal divisions, and they raise questions about how online political communication should be regulated in South Africa as the country approaches the 2024 elections.
The already complex landscape of online influence has recently been further disrupted by the rise of what has come to be termed ‘cyber-troops’ — government or political party actors that use social media specifically to manipulate public opinion — alongside paid digital influencers — individual users that leverage their own brand and the trust they have built with their audience to influence behaviour. These relatively new actors play significant roles in shaping public opinion during elections. This raises the question: what current legal measures are available to regulate content disseminated through inauthentic online behaviour, and what additional recommendations or tools can assist in combating various forms of online disinformation?
FREE AND FAIR ELECTIONS AND THE ROLE OF SOCIAL MEDIA
Effective democratic governance is fundamentally premised on electoral processes that are free and fair. Robert Dahl, the political scientist who coined the term “free and fair elections,” defines “freedom” as the right and opportunity [of all those entitled to vote] to be able to choose one over another without fear of coercion or intimidation, and “fairness” as encapsulating impartiality, underscored by the unbiased application of rules and reasonableness in the allocation of resources for political parties. This definition is also generally accepted within South Africa’s context.
The increased dissemination of information online has enhanced the power of social media and made digital platforms into effective tools of political persuasion. Social media has positively affected democracy by bolstering voter turnout and maximising the impact of social media campaigns by facilitating direct engagement with political candidates through the use of cost-effective technologies on different social media platforms. However, increased social media engagements have also furthered the dissemination of election-related disinformation, including through inauthentic online behaviour.
THE EMERGENCE OF CYBER TROOPS AND DIGITAL INFLUENCERS IN SOUTH AFRICA
Cyber-troops have become increasingly active in South Africa in recent years. They usually consist of a combination of human operators and automated bots and have played a pivotal role in electoral contexts around the world. In South Africa, cyber-troops are predominantly used by politicians and political parties to drive disinformation campaigns through emotional manipulation and narrative shaping, otherwise known as valence techniques. This encompasses spreading pro-government or pro-party propaganda, launching attacks on political opponents, orchestrating smear campaigns, diverting conversations away from crucial issues, and fomenting divisions among citizens. Unlike the diffuse spread of disinformation, misinformation, and mal-information by ordinary users, cyber-troops operate within organised structures with official support. Their primary objective is to manipulate perceptions through emotional appeals and narrative control, and they are integral to larger political or governmental strategies.
The use of these valence techniques was demonstrated in the lead-up to the 2021 local government elections. Together with a range of individual influencers, both known and unknown, political parties played a central role in advancing emotional narratives grounded in anti-immigrant sentiment. ActionSA’s messaging reportedly adopted a dual narrative approach that contrasted the rule of law and service delivery with a negative portrayal of migrants. Similarly, the ATM, often linked to the African National Congress (ANC)’s Radical Economic Transformation (RET) faction, and an early advocate of the #PutSouthAfricansFirst project, continued referencing the hashtag in its Facebook content and leveraging emotive and polarising content. It is notable that stereotyping and the vilification of migrants predominantly occurred through anonymous Twitter accounts.
Research has demonstrated that content from anonymous influencers also helped these parties gain high levels of engagement across various social media platforms. Digital influencers offer their services to political actors and typically earn money for services such as creating hashtags, crafting online identities, and employing inauthentic engagement techniques to amplify their reach. Usually, they are real people with an existing following, but cases have also been documented of “bot influencers” being used. For example, the infamous “Guptabots” campaign from 2016 to 2017 utilised hundreds of fake accounts controlled by a team of individuals to bolster perceptions of the Gupta family and the ruling African National Congress (ANC)’s Radical Economic Transformation (RET) faction while attacking critics. This campaign employed both ‘sock puppet’ accounts — several accounts held by one person pretending to be multiple users — and bots — automated accounts with no real user behind them — to manipulate social media discourse.
The moderation of different forms of content on social media platforms is predominantly conducted using algorithmic tools such as automated filters and machine learning models. While these tools are efficient in processing large amounts of content, they fall short of adequately addressing nuance and context. The lack of local staff, limited language capability, and the challenge of scalability further contribute to the inadequacy of these algorithmic tools in effectively addressing content disseminated by cyber-troops in South Africa.
GAPS IN LEGISLATION
While freedom of expression is a fundamental right, it is not absolute and is subject to certain limitations as defined in both international and domestic law. South Africa has implemented some legal measures aimed at ensuring the responsible exercise of this right, particularly during election periods. These measures include section 89(2) of the Electoral Act 73 of 1998 and section 69(2) of the Local Government: Municipal Electoral Act 27 of 2000, which prohibit the dissemination of false information with the intention of, among other things, influencing the conduct and outcome of an election. Additionally, the Electoral Code of Conduct under the Electoral Act prohibits registered political parties or candidates from disseminating false or defamatory information concerning elections and from generally abusing their position of power, privilege, or influence to affect the outcome of an election. Government communicators, public servants, and state-financed media are also prohibited, through various regulations, from disseminating information that promotes or prejudices one political party over another.
While these provisions broadly establish the prohibition of disinformation, several uncertainties persist. It is unclear whether these safeguards extend to the dissemination of partisan messaging that may not be inherently false but has the potential to mislead online users. Additionally, the phrasing regarding political parties and candidates “generally abusing their position of power” is overly broad, with little guidance to assist in determining what conduct would amount to such an abuse. Lastly, there are unresolved concerns about monitoring, tracking, and enforcing the removal of narrative-focused content disseminated by cyber-troops, including the roles and responsibilities of digital platforms in removing such content.
RECOMMENDATIONS AND CONCLUSION
In preparation for South Africa’s upcoming elections, it is evident that algorithmic tools, though valuable, cannot solely bear the responsibility of detecting the intricacies of influence campaigns and acting against them. Human expertise remains essential in identifying the nuanced strategies employed in these campaigns, especially those designed to cause division.
To effectively combat the impact of computational propaganda, the implementation of algorithmic regulation tools should be closely aligned with legislation and regulation and supported by the Independent Electoral Commission. The Commission, working in collaboration with civil society, possesses a more comprehensive understanding of the country’s election landscape and the underlying issues, such as xenophobia, that may be exploited in favour of specific electoral outcomes.
Notably, the Commission has collaborated with Google, Meta, TikTok, and Media Monitoring Africa (MMA) to promote access to accurate information, raise awareness of the harms of disinformation, and provide training to political parties, election candidates, and other stakeholders involved in the electoral process. It has also worked with MMA, a media ethics and freedom watchdog, to develop the Real411 tool, which aims to empower the public to report instances of online disinformation, hate speech, incitement, and harassment. These reports are then scrutinised, flagged, and reported to the platforms for removal. Further, Africa Check, the continent’s first independent fact-checking organisation, scrutinises public statements, analyses the most credible evidence, and issues fact-checking reports to inform public discourse.
While initiatives such as Real411 and independent fact-checking contribute significantly to combating false information, there remains a need for more robust measures to ensure effective monitoring, enforcement, and swift responses against those violating online regulations and operating in the grey area of online influence and manipulation. Prioritising digital literacy training, improving contextualised content moderation, and advocating for greater transparency from tech platforms regarding their content moderation processes are equally crucial. The UNESCO Guidelines for Internet Governance emphasise the importance of a multistakeholder and human rights-based approach and stress the need for complementary self-regulatory, co-regulatory, and statutory regulatory arrangements among states, intergovernmental organisations, civil society, and media, among others.
Importantly, offline solutions also play a crucial role in addressing the societal issues that drive digital political manipulation. Even localised grievances can have deeply polarising effects, making it difficult to tackle the full spectrum of harmful online information at scale. Consequently, a systemic approach is required to address the societal issues that inform the narratives perpetuated by cyber-troops.
+-*#*-+
* Phenyo is a Tech Rights fellow at ALT Advisory working at the intersection of human rights and digital technologies in the African region.
ENDS.