Big Nudging and Misinformation in the Era of COVID-19
There are many worries about the Information Age, or the Misinformation Age, in which we find ourselves, and about how life in the digital world is driving us further from democracy and self-determination. In my last post, I introduced neo-colonialism as it is enforced through data colonialism and digital colonialism. In this post, I review those terms as a precursor to discussing how Big Nudging and misinformation in the era of COVID-19 are affecting our free will. However, I argue that if we become aware of these forces and work together, perhaps we can move toward democracy and not away from it. To do this, we can take some tips from the US Surgeon General, which I review below.
Data Mining and Big Nudging Help to Spread Misinformation
Data mining describes the large-scale collection and extraction of value from data, in a manner reminiscent of colonial resource extraction. Data colonialism is when Big Data is collected and used to control or manipulate populations. (Couldry & Mejias, 2019) Digital colonialism is a parallel term covering the use of digital technologies for social, economic, and political domination. (Kwet, 2019)
Big Nudging could be considered data colonialism in action, although who is actually holding the reins of power is not always clear. Is Big Nudging merely a tool for control, or can it also be used for good?
The concept of nudging is akin to 'influence with an agenda': external forces shaping individual or group behaviors and decisions. Nudge theory was popularized by Richard Thaler, a behavioral economist, and Cass Sunstein, a legal scholar. Nudging coaxes behavior without forcing it, tweaking the environments in which we make decisions by drawing on insights about our mental processes. It can be used within a family, say to remind a loved one to take their daily medicine, or on a larger scale, for example by requiring people to opt out of organ donation rather than opt in. The idea is that we still have the choice, without economic or other incentives and without mandates. (Yates, 2020) When this psychological tool relies on Big Data, it is called Big Nudging. This can be subtle and dangerous: when people are unaware that they are being nudged, they wrongly believe they are acting of their own free will.
Political campaigners are major culprits here, combining profiling with Big Nudging to determine which demographic groups individuals belong to and which issues matter most to them, in order to win support for their propositions. Big Nudging is strongly suspected of having been used in many large political campaigns, such as Brexit and the 2016 US presidential election. (Wong, 2019)
“The term ‘big nudging’ has emerged to represent using big data and AI to exploit psychological weaknesses to steer decisions — creating problems such as damaging social cohesion, democratic principles, and even human rights.” (Vinuesa et al., 2020, p. 3)
Big Nudging plays on our emotions, and it works almost too well, especially in spreading misinformation. This may help explain why one study found that false news stories were 70% more likely to be shared than true stories, and why they so often go viral. (Vosoughi et al., 2018) During the pandemic, nudging has been used alongside mandates for measures like mask-wearing and social distancing, with varying results. (Dudás & Szántó, 2021) Some efforts were indeed used for good, such as handwashing campaigns; however, the threat of Big Nudging spreading misinformation appears to outweigh the benefits.
What can be done about COVID-19 misinformation?
Recently, the Surgeon General of the United States, Dr. Vivek H. Murthy, released a report on the dangers of COVID-19 misinformation as a public health concern. As a next step, Murthy has issued a request for information “. . . on the impact and prevalence of health misinformation in the digital information environment during the COVID-19 pandemic.” (Lesko, 2022)
In the report, Murthy lists several reasons misinformation spreads so rapidly, along with calls to action for a whole-of-society effort to combat it during the pandemic and beyond. This is extremely useful and could help curb Big Nudging on multiple fronts.
Here are the reasons misinformation tends to spread so quickly on online platforms:
The emotional and sensational nature of misinformation heightens psychological responses like anxiety and creates a sense of urgency to react and share.
Incentives for likes, comments, and other reactions reward engagement over accuracy.
Popularity and similarity to previously viewed content are favored by algorithms, which can cause confusion and reinforce misunderstanding. (Murthy, 2021) A toy ranking sketch below illustrates the point.
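The sketch that follows is a purely hypothetical ranking function, not any real platform's algorithm; the weights, fields, and example posts are invented for illustration. It shows how scoring content on predicted engagement alone can push a sensational, inaccurate post above a careful, accurate one:

```python
# Toy illustration only: not any real platform's ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    accuracy: float   # 0..1, how well-supported the claim is
    outrage: float    # 0..1, how emotionally charged it is
    shares: int       # prior popularity

def engagement_score(post: Post) -> float:
    """Hypothetical objective: popularity and emotional charge predict
    engagement; accuracy is not part of the objective at all."""
    return 0.7 * post.outrage + 0.3 * min(post.shares / 1000, 1.0)

posts = [
    Post("Careful explainer on vaccine safety", accuracy=0.95, outrage=0.2, shares=120),
    Post("Shocking claim about a miracle cure", accuracy=0.05, outrage=0.9, shares=800),
]

# Rank the feed purely by predicted engagement.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.title}")
```

Because accuracy never enters the score, the sensational post ranks first. Real recommender systems are far more complex, but the incentive structure described in the advisory is the same.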
Distrust of the government and/or the healthcare system can further cause misinformation to flourish. It is especially prevalent in areas of significant societal division and political polarization, and among those who have experienced racism or other inequities, misinformation can spread even more easily. (Murthy, 2021)
The US healthcare system is privatized and has shown bias along socioeconomic lines and against minorities, so it is not difficult to understand people's mistrust of it. However, over-reliance on emotionally charged misinformation leaves everyone confused, not knowing what to trust or believe. A recent analysis found that a widely used algorithm in US hospitals, which helps manage the care of about 200 million people each year, had been systematically discriminating against Black people. The algorithm used healthcare costs as a proxy for medical need, and because less money is typically spent on Black patients with the same level of need, it assigned them lower risk scores. However, by recalculating individual medical needs from variables other than healthcare costs, the researchers reduced the bias by 84%. This shows that more diversity is needed on algorithm design teams, and that more testing needs to be done before such algorithms are deployed in people's lives. (Ledford, 2019)
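To make the proxy-label problem concrete, here is a minimal, hypothetical simulation. It is not the actual hospital algorithm: the group labels, the 0.5 access multiplier, the Poisson illness model, and the 90th-percentile enrollment cutoff are all invented for illustration. The point is only that a model trained to predict spending under-serves a group whose equal illness generates lower costs, while a model trained to predict illness itself does not:

```python
# Hypothetical simulation of proxy-label bias; NOT the actual hospital algorithm.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20_000

# Two groups with identical distributions of underlying illness.
group = rng.integers(0, 2, n)        # 0 = group A, 1 = group B (invented labels)
conditions = rng.poisson(2.0, n)     # chronic conditions: a direct measure of need

# Assumption: group B faces access barriers, so equal illness produces lower spending.
access = np.where(group == 1, 0.5, 1.0)
prior_cost = conditions * access + rng.normal(0, 0.2, n)
future_cost = conditions * access + rng.normal(0, 0.2, n)

X = np.column_stack([conditions, prior_cost])

# Flawed label: predict future spending. Corrected label: predict illness itself.
cost_model = LinearRegression().fit(X, future_cost)
need_model = LinearRegression().fit(X, conditions)

sickest = conditions >= 5   # the patients an extra-care program should reach
for name, model in [("cost label", cost_model), ("need label", need_model)]:
    score = model.predict(X)
    enrolled = score >= np.quantile(score, 0.9)           # top scorers get extra care
    for g, label in [(0, "group A"), (1, "group B")]:
        covered = enrolled[sickest & (group == g)].mean()  # share of sickest reached
        print(f"{name}, {label}: {covered:.0%} of the sickest patients enrolled")
```

Under these assumptions, the cost-trained model reaches nearly all of the sickest patients in the group with unimpeded access but only a fraction of the sickest patients in the group facing barriers, while the need-trained model reaches both groups equally; this mirrors the direction of the finding reported by Ledford (2019).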
How can we address health misinformation, and hopefully prevent misinformation in other spheres going forward?
The Surgeon General listed some recommendations for taking action:
Equip Americans with the tools to identify misinformation, make informed choices about what information they share, and address health misinformation in their communities, in partnership with trusted local leaders.
Expand research that deepens our understanding of health misinformation, including how it spreads and evolves; how and why it impacts people; who are most susceptible; and which strategies are most effective in addressing it.
Implement product design and policy changes on technology platforms to slow the spread of misinformation.
Invest in longer-term efforts to build resilience against health misinformation, such as media, science, digital, data, and health literacy programs and training for health practitioners, journalists, librarians, and others.
Convene federal, state, local, territorial, tribal, private, nonprofit, and research partners to explore the impact of health misinformation, identify best practices to prevent and address it, issue recommendations, and find common ground on difficult questions, including appropriate legal and regulatory measures that address health misinformation while protecting user privacy and freedom of expression. (Murthy, 2021)
The US Surgeon General provided many tips for healthcare workers, educators, journalists, tech companies, governments, and the public on how to combat health misinformation, with an emphasis on building resilience to it. (Murthy, 2021) Misinformation exists independently of colonialism in all of its forms, yet it has been used as a tool to keep people controlled and to nudge them toward decisions that feed systems of control. Those systems are now embedded in the algorithms that direct what we see online, and our own emotions do the rest of the work.
My question is this: can we apply Dr. Murthy's advice to decolonize ourselves and the digital world, building resistance to misinformation and Big Nudging and truly making our own democratic decisions, during the pandemic and beyond? Can we learn from all of this and move forward stronger, armed with the knowledge that systems meant to benefit people but built like businesses, such as the US healthcare system, are not working for us, and democratically call for better systems that truly serve all people? If we can figure out how to combat misinformation and Big Nudging, perhaps we can move toward democracy and not away from it. To do that, we must educate ourselves, learn to recognize what is false and what is manipulative, and call it out, shut it out, and move on.
You can stay up to date with Accel.AI workshops, research, and social impact initiatives through our website, mailing list, meetup group, Twitter, and Facebook.
Join us in driving #AI for #SocialImpact initiatives around the world!
References
Couldry, N., & Mejias, U. A. (2019). Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject. Television & New Media, 20(4), 336–349. https://doi.org/10.1177/1527476418796632
Dudás, L., & Szántó, R. (2021). Nudging in the time of coronavirus? comparing public support for soft and hard preventive measures, highlighting the role of risk perception and experience. PLOS ONE, 16(8). https://doi.org/10.1371/journal.pone.0256241
Gramacho, W., Turgeon, M., Kennedy, J., Stabile, M., & Mundim, P. S. (2021). Political Preferences, Knowledge, and Misinformation About COVID-19: The Case of Brazil. Frontiers in Political Science, 3. https://doi.org/10.3389/fpos.2021.646430
Kwet, M. (2019). Digital colonialism: US empire and the new imperialism in the Global South. Race & Class, 60(4), 3–26. https://doi.org/10.1177/0306396818823172
Ledford, H. (2019). Millions of black people affected by racial bias in health-care algorithms. Nature, 574(7780), 608–609. https://doi.org/10.1038/d41586-019-03228-6
Lesko, M. (2022). Impact of health misinformation in the digital information environment in the United States. Federal Register. Retrieved March 10, 2022, from https://www.federalregister.gov/documents/2022/03/07/2022-04777/impact-of-health-misinformation-in-the-digital-information-environment-in-the-united-states
Lucero, V. (2022, February). From CTA/CTT to voter tracing? Risk of data misuse in the Philippines. EngageMedia. Retrieved February 16, 2022, from https://engagemedia.org/2022/philippines-contact-voter-tracing/
Murthy, V. H. (2021). Confronting health misinformation: The U.S. Surgeon General's advisory on building a healthy information environment. U.S. Department of Health and Human Services. Retrieved March 10, 2022, from https://www.hhs.gov/sites/default/files/surgeon-general-misinformation-advisory.pdf
Vinuesa, R., Azizpour, H., Leite, I., Balaam, M., Dignum, V., Domisch, S., Felländer, A., Langhans, S. D., Tegmark, M., & Fuso Nerini, F. (2020). The role of artificial intelligence in achieving the Sustainable Development Goals. Nature Communications, 11(1), 233. https://doi.org/10.1038/s41467-019-14108-y
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
Wong, S. (2019, September). Filter bubbles and big nudging: Impact on data privacy and civil society. Hong Kong Lawyer. Retrieved February 22, 2022, from http://www.hk-lawyer.org/content/filter-bubbles-and-big-nudging-impact-data-privacy-and-civil-society
Yates, T. (2020, March 13). Why is the government relying on nudge theory to fight coronavirus? The Guardian. Retrieved March 12, 2022, from https://www.theguardian.com/commentisfree/2020/mar/13/why-is-the-government-relying-on-nudge-theory-to-tackle-coronavirus