November 28, 2021


How biases fuel misinformation and disinformation, by Nkem Agunwa



Misinformation and disinformation are arguably as old as humanity. The popular myth that bats are blind is scientifically inaccurate, and thus an early example of misinformation. Research has shown that bats are anything but blind: they see in black and white, and at night they can potentially see even better than humans. The introduction of information technology, particularly social media, has undoubtedly magnified the impact of mis/disinformation. It has opened up the space for the exponential spread of unchecked information at lightning speed, thereby challenging truth and reality, stirring division and conflict, breeding a sceptical public, threatening democracies and contributing to human rights violations.

Mis/disinformation in Africa

The toxic polarisation of recent elections across the world and the rapid spread of false information about the COVID-19 virus have demonstrated the harmful potential of mis/disinformation. People have lost loved ones, experienced severe health complications, been excluded, ostracised, attacked, harassed and discriminated against as a result of the proliferation of mis/disinformation.

In Nigeria, there was a pervasive belief that Nigerians were immune to the COVID-19 virus. This originated from the false claim that black people possess a natural protective response to SARS-CoV-2. Furthermore, the hysteria around hydroxychloroquine as a cure for COVID-19, resulting from the unsubstantiated claim in a viral video by the Nigerian-trained U.S.-based physician, Dr. Stella Immanuel, led to panic buying and stockpiling of the drug. Despite extensive media coverage debunking these claims, people still relied on this misleading information, resulting in chloroquine poisoning and refusal to comply with COVID-19 protocols, thereby increasing the risk of exposure to the virus. Also, the conspiracy theory linking 5G to the virus led to widespread panic and attacks on 5G infrastructure in countries like South Africa.

In the aftermath of the 2017 general elections in Kenya, mis/disinformation played a significant role in escalating the post-election violence in the country. The opposition coalition claimed that 100 people had been killed by police officers, while official reports from the Kenya National Commission on Human Rights placed the death toll at 24. This contributed to the heightened political tension in the country.

Countries in Africa are actively pushing back against mis/disinformation. Individuals and groups across the continent have launched campaigns to counter the myths around the COVID-19 virus and dispel misleading narratives. An interesting example is Sarah Dawns, a South African also known as the Mistress of Science, who fights medical misinformation on social media. The Centres for Disease Control in most African countries provide frequent updates on the number of confirmed cases, successful treatments and deaths related to the virus in their respective countries. Fojo and Pax Press, along with AfricaCheck, are building the capacity of journalists in Rwanda in fact-checking, verification and cybersecurity. These are timely and innovative efforts that help fight mis/disinformation on the continent. However, the growing number of mobile phone users in Africa, coupled with declining trust in the mainstream media and the widespread use of encrypted messaging applications like WhatsApp, means these interventions remain grossly insufficient to meet the scale of the challenge. Furthermore, research shows high exposure to perceived misinformation in some African countries and a heightened propensity of citizens to intentionally share false information.

Layers of Bias

What motivates people to share misleading information? The spread of misinformation plays to the natural instinct of humans to gravitate towards information that is familiar and aligns with preexisting beliefs. This sometimes shows up in the form of bias, which is influenced by our experiences, education, values and preferences, among other factors that shape our world view.

A 2018 study on the spread of true and false news online found that false news spreads significantly farther, faster, deeper, and more broadly than the truth. Bots share the digital space with humans and can be programmed to disseminate inaccurate news. In this particular study, however, when all bots were removed from the dataset, the differences between the spread of false and true news remained the same, lending credence to the hypothesis that humans, not bots, propel the spread of misinformation.


The Internet presents a plethora of information sources pushing out news stories in seconds, all competing for the user's attention. The resulting information overload makes it impossible for users to effectively assimilate the multiplicity of online content. To make sense of it, the human brain develops heuristic shortcuts that enable quick analysis of information. Unfortunately, these shortcuts are laced with layers of bias that make the user vulnerable to mis/disinformation and provide fertile ground for its spread.

Cognitive Bias

Having understood users' cognitive biases, content creators are becoming increasingly adept at grabbing attention by deploying a variety of tricks that make users respond more favourably to their content. These cognitive biases include confirmation bias, the conformity effect, overconfidence, availability bias and naive realism. Users are more likely to embrace information that confirms their prior beliefs and values, and to discard or challenge any information to the contrary. They are more inclined to promote content that affirms their identity, status, religious beliefs or ideology, even when that content is misleading.


For example, the xenophobic attacks in South Africa were fueled by the claims that the influx of foreigners had created unemployment for citizens, overburdened sectors such as health care and caused insecurity in the country. This resulted in the violent assault of foreigners, as it resonated with the belief held by many South Africans that foreigners are a threat to their identity and safety, given their history of discrimination under the apartheid regime. A joint report by the OECD Development Centre and the International Labour Organisation (ILO) reveals that there is no significant effect of immigrant workers on the employment of the native-born at the national level. In fact, immigrants in South Africa have a positive net impact on the government’s fiscal balance because they tend to pay more in taxes.

In Nigeria, the vicious conflict between the Fulani pastoralists and the predominantly Christian indigenes that has long ravaged southern Kaduna, an area in northern Nigeria, has been significantly fuelled by emotionally charged narratives and exaggerated claims, leading to human and material losses on both sides. Yet the underlying cause of the conflict is the scramble for limited land and other resources, exacerbated by climate change and the government's failure to protect the people.

Societal Bias

The need for validation is a major motivation to identify with a community and feel a sense of belonging and acceptance. People tend to gravitate towards communities that share their values and ideologies, and as a result, the information flow within such circles is rarely challenged, because of the common belief in each other's credibility and the fear of ostracisation. This creates an echo chamber with limited room for divergent views, making it difficult for misinformation to be detected. This vicious circle sows division within the larger society, because people exist in silos and are uninformed about the nuances of other groups and communities. It is a slippery slope into an 'us versus them' scenario, which tears communities apart.

Furthermore, there is a tendency to assume that trending or popular opinions are accurate. Oftentimes, we have seen tweets with massive engagement from influential figures taken as fact because they carry the approval of a community, thereby creating a false social consensus, or clout. These are societal conditionings and trappings that are ripe for manipulated content to percolate, tear down and fracture communities.

Technology-Aided Bias

The Internet presents an endless web of information that could overwhelm users with irrelevant data. In a bid to create tailored content that meets the information needs of individual users, algorithms selectively guess what information a user would find useful or interesting, on the basis of their digital footprints. Consequently, users are consistently fed with personalised content that align with their beliefs, desires and motivations. This reinforces their existing biases and further isolates them from divergent views and ideologies, thereby creating a condition that Eli Pariser describes as a filter bubble.


This bias in machines makes users more vulnerable to conspiracy theories, hyper-partisan content and outright fabrication. The algorithmic filter leads to overwhelming exposure to monolithic information, which harms civic debate. Furthermore, on the basis of frequency alone, popular content, regardless of its quality or accuracy, is pushed to users unsolicited, capturing their attention. This is particularly dangerous because the system can be manipulated or gamed by anyone with nefarious intent to spread mis/disinformation.

The WITNESS Approach to Combating Mis/Disinformation

Over the last three years, WITNESS, an international non-governmental organisation, has provided recommendations to technology platforms to help make users less vulnerable to mis/disinformation. Technology platforms must understand the responsibility they owe society as the primary gatekeepers of information flows: to ensure that information is not just relevant but important, challenging, and inclusive of other viewpoints. The responses of technology platforms to the spread of mis/disinformation during the pandemic have shown the benefit of promoting accurate information in-platform and helping users find and recognise it. This has to be scaled up to include all kinds of information, not just COVID-19-specific information, in addition to other steps such as increasing their commitment to more resourcing in the global south and using a human rights-based approach in their policies.

WITNESS has also embarked on a series of workshops in the global south aimed at tackling the threats posed by artificial intelligence-enabled forms of media manipulation. We have identified media literacy as one of the effective ways of combating mis/disinformation. This was one of the findings from our deepfakes convening in Sub-Saharan Africa, organised in collaboration with the Centre for Human Rights, University of Pretoria. This is why we are currently embarking on a project for West Africa that seeks to identify the best ways to reduce the harm caused by mis/disinformation, especially to grassroots communities.

Other initiatives undertaken by WITNESS and our partners include: the drafting of an access protocol for deepfakes detection tools; contribution to the development of a responsible, human rights-respecting authenticity infrastructure; and the development of user-friendly verification and accountability tools. WITNESS has also developed a set of criteria for assessing the wide range of responses that technology platforms adopted in fighting COVID-19-related mis/disinformation, using a framework based on human rights and our experience working with marginalised communities and human rights defenders globally.

Pause Before Sharing

Biases are a blind spot, and it is important that we open ourselves up to information that may not necessarily align with our held beliefs and ideologies. The golden rule is to pause before sharing any content. It is also important to conduct your own preliminary investigation to help you reach an objective conclusion: seeking out different opinions, hearing others out, questioning the source, and corroborating with other credible media sources. This is known as the SIFT method.

In navigating the algorithmic biases of websites and social media platforms, some strategies, though limited, can help ensure balanced information: clearing out your search history regularly; using multiple search engines for your queries; concealing your IP address where possible; and, most importantly, actively seeking out trustworthy materials that hold divergent viewpoints.

There is no magic bullet for eliminating mis/disinformation. Human rights movements will always face adversaries working to sabotage a future free of deception. The solution must be multi-pronged and cross-cutting. As Julia Koller brilliantly puts it: “Information is only as reliable as the people who are receiving it. If readers do not change or improve their ability to seek out and identify reliable information sources, the information environment will not improve”. To improve the information ecosystem, we must adopt some healthy digital behaviours, and this includes keeping our biases in check.

Nkem Agunwa is the Project Coordinator Africa at WITNESS.

Support PREMIUM TIMES’ journalism of integrity and credibility
