Digital Disinformation Is a Threat to Public Health

Austin McNeill Brown


As the worldwide death toll from COVID-19 surges past half a million and infections exceed 10 million, it is clear this crisis is far from over. National leaders have refused to acknowledge the seriousness of the issue and have neglected to act on behalf of the people they govern and serve.

In the U.S., the absence of a comprehensive, science-based response has led to poor health and economic outcomes. Health disinformation has driven minimization and denial of the pandemic and poses an existential threat to the welfare of the population.

A recent study by researchers at Carnegie Mellon University (CMU) highlights a disturbing trend in social media. Researchers determined that nearly half of all the Twitter accounts promoting the reopening of America were likely bots.1 The bots are one element of coordinated and partially automated disinformation campaigns, which may serve political agendas and sow division. Bots can be deployed in staggering numbers to promote conspiracy theories with disturbing real-world consequences, such as heavily armed American citizens protesting the health and safety measures meant to protect them. The ongoing coronavirus pandemic highlights the susceptibility of a population under duress, particularly regarding health and science communication efforts. This pandemic calls attention to the urgent need to directly combat public health disinformation.

Twitter Bots, #ReopenAmerica, and the Coordinated Disinformation Network
Beginning in January, CMU researchers analyzed 200 million tweets discussing COVID-19 and found that 82% of the top 50 most influential “tweeps” (persons who tweet) were bots. Of the top 1,000 retweets, 62% were shared by bots. Bots can be identified by the speed at which they operate and by their “movement”: tweeting from one location and, moments later, from the other side of the globe. Bots can also be spotted when many accounts tweet similar messages simultaneously, often sharing a hashtag such as #reopenamerica. Some of these accounts are run entirely by automated bots, which accounted for 34% of the “reopen America” subset; the remaining 66% of the tweets in question were posted by accounts likely operated by humans with bot assistants. Dr. Kathleen Carley and her team, who have monitored digital disinformation for years, note the dramatic increase in bot numbers: during election cycles and other political events, bots typically push between 10% and 20% of disinformation. These latest findings indicate a significant, coordinated increase in disinformation efforts.2,3
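
These behavioral signals are simple enough to illustrate in code. The sketch below is a toy Python example of two of the heuristics described above: an account “moving” between distant locations faster than physically possible, and many accounts pushing the same hashtag within a narrow window. Every account name, timestamp, location, and threshold here is hypothetical; the CMU team’s actual detection relies on trained machine-learning models over many account and network features, not hand-set rules like these.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical tweet records: (account_id, timestamp, location, hashtag).
TWEETS = [
    ("acct_001", datetime(2020, 5, 1, 12, 0, 0), "New York, US", "#reopenamerica"),
    ("acct_001", datetime(2020, 5, 1, 12, 0, 4), "Lagos, NG", "#reopenamerica"),
    ("acct_002", datetime(2020, 5, 1, 12, 0, 1), "Austin, US", "#reopenamerica"),
    ("acct_002", datetime(2020, 5, 1, 12, 0, 2), "Austin, US", "#reopenamerica"),
    ("acct_003", datetime(2020, 5, 1, 12, 0, 2), "Denver, US", "#reopenamerica"),
]

MIN_TRAVEL_TIME = timedelta(minutes=5)  # fastest plausible location change
BURST_WINDOW = timedelta(seconds=30)    # window for a "simultaneous" burst
BURST_THRESHOLD = 3                     # distinct accounts that make a burst


def teleporting_accounts(tweets):
    """Flag accounts that tweet from different locations implausibly fast."""
    history = defaultdict(list)
    for acct, ts, loc, _ in tweets:
        history[acct].append((ts, loc))
    flagged = set()
    for acct, events in history.items():
        events.sort()
        for (t1, loc1), (t2, loc2) in zip(events, events[1:]):
            if loc1 != loc2 and (t2 - t1) < MIN_TRAVEL_TIME:
                flagged.add(acct)
    return flagged


def hashtag_bursts(tweets):
    """Find hashtags pushed by many distinct accounts within a narrow window."""
    by_tag = defaultdict(list)
    for acct, ts, _, tag in tweets:
        by_tag[tag].append((ts, acct))
    bursts = {}
    for tag, events in by_tag.items():
        events.sort()
        for i, (start, _) in enumerate(events):
            accounts = {a for t, a in events[i:] if t - start <= BURST_WINDOW}
            if len(accounts) >= BURST_THRESHOLD:
                bursts[tag] = accounts
    return bursts


if __name__ == "__main__":
    print("Teleporting accounts:", teleporting_accounts(TWEETS))
    print("Coordinated hashtag bursts:", hashtag_bursts(TWEETS))
```

In practice, thresholds like these would be tuned against labeled bot data, and no single signal would be treated as conclusive on its own.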

The Psychopolitics of Digital Information
Understanding psychopolitical philosophy and how it applies to digital space is an important addition to the public health toolbox. Psychopolitics is defined as the political administration of psychological power, particularly through digital and social media.4 The mechanisms of psychopolitical control operate under the pretense of collective digital space, giving users the illusion of freedom within corporate-owned, data-generating platforms. Therein lies the subtlety of psychopolitical power. Connecting people to one another and to information is secondary to the main function of social media, which is to generate data that corporations can buy and sell and that provides advertising revenue.

Due to the coronavirus and social distancing measures, social media is currently the simplest and safest way for people to stay in touch with each other and the outside world. Like our currently polarized society, the digital social media space has become increasingly partisan. In the last decade, the addition of the News Feed feature to Facebook and the ability to “share” news and information quickly have accelerated the political nature of social media. Today, Facebook is a primary information source for millions of Americans. Such digital corporate spaces become a nexus for psychopolitical power wielded over millions of people. This influence is powerful enough to drastically alter behavior, sway voting blocs, radicalize dissidents, and incite political activism.5

Public Health Promotion in the Digital Space
Public health promotion relies on the widespread dissemination of valid and accessible information, particularly during time-sensitive outbreaks. The public’s perception of health officials often makes a crucial difference in outcomes. There have been striking successes, like the campaign against tobacco use.7 There have also been striking failures, including the spread of misinformation regarding HIV/AIDS and efforts to undermine sexual health promotion, pregnancy prevention, and harm reduction strategies for people with substance use disorders.8,9 With the rise of social media and shared digital spaces, officials have a pivotal obligation to promote factual content during the current public health crisis. At this moment in history, digital public health messaging faces substantial challenges that will shape the future of public health for years to come. Social and behavioral science must inform effective digital health promotion that can address this threat, guide public behavior, and mitigate prejudice while delivering essential information.11

Disinformation Is a Public Health Threat and Public Health Experts Must Treat It as Such
The abuse of psychopolitical power has become increasingly clear, as national leadership engages in misinformation, minimization of public health dangers, and sometimes flat-out lies. The push to “reopen America,” despite direct objections from health officials, has undone efforts to curb the spread of the virus, with infection rates rapidly climbing out of control.18 At the same time, the rejection of basic public health measures, like wearing a mask in public, has become a political statement.19 Each of these elements has undermined any centralized response to the health crisis at a time when the flow of apolitical and factual information is of critical importance.

Given the existing political environment surrounding the COVID-19 pandemic, public health researchers must take it upon themselves to address the philosophical, psychological, and behavioral implications of the threat disinformation poses to public health. Disinformation tactics have a long, well-documented history, and that history should be incorporated into general public health knowledge and training. Public health researchers must not confuse deliberate disinformation campaigns with willful ignorance or political partisanship. While many public health researchers began their careers with little intention of getting into the finer points of espionage, psyops, and digital technology, it is now clear that public health experts must familiarize themselves with these strategies. There are broad and well-studied methods for identifying and countering deliberate disinformation, one of which is sketched below.12,13,14,15,16,17
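
To make the point concrete, one well-studied identification method is stylometric: reference 12 finds that fabricated news tends to pack its claims into long, dense headlines and to use simpler, more repetitive body text than legitimate reporting. The Python sketch below computes a few illustrative features of that general kind; it is a hypothetical example under those assumptions, not the actual feature set or classifier from that study.

```python
import re


def stylometric_features(title: str, body: str) -> dict:
    """Compute simple style signals of the kind studied in fake-news
    detection research: headline density and body-text repetitiveness."""
    title_words = title.split()
    body_words = re.findall(r"[A-Za-z']+", body.lower())
    return {
        "title_word_count": len(title_words),
        # Fraction of fully capitalized title words: "shouting" headlines.
        "title_all_caps_ratio":
            sum(w.isupper() for w in title_words) / max(len(title_words), 1),
        # Unique words / total words: lower means more repetitive text.
        "body_type_token_ratio":
            len(set(body_words)) / max(len(body_words), 1),
        # Average word length: shorter suggests simpler vocabulary.
        "body_avg_word_length":
            sum(map(len, body_words)) / max(len(body_words), 1),
    }


if __name__ == "__main__":
    print(stylometric_features(
        "SHOCKING TRUTH They Do Not Want You To Know About Masks",
        "Masks are bad. Masks are really bad. Really, really bad.",
    ))
```

Features like these are cheap to compute at scale, which is what makes automated screening of suspect content feasible; in a real system they would be fed, alongside many other signals, into a classifier trained on labeled articles.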

Public health researchers often collaborate with scientists, politicians, organizations, public health systems, and medical experts. The COVID-19 pandemic highlights the need to expand such collaborative efforts beyond traditional affiliations. Digital space experts, technical scientists, data programmers, and even intelligence organizations may provide knowledge and techniques to combat online health disinformation campaigns. Freedom of speech is a powerful institution that should be safeguarded. By confronting disinformation, public health officials honor the commitment to free speech and the dissemination of evidence-based health information.

References   

  1. Alvino Young, V. (2020, May 20). Nearly half of the Twitter accounts discussing ‘reopening America’ may be bots. Carnegie Mellon University. https://www.cs.cmu.edu/news/nearly-half-twitter-accounts-discussing-%E2%80%98reopening-america%E2%80%99-may-be-bots
  2. Hao, K. (2020, May 21). Nearly half of Twitter accounts pushing to reopen America may be bots. MIT Technology Review. https://www.technologyreview.com/2020/05/21/1002105/covid-bot-twitter-accounts-push-to-reopen-america/
  3. Wardle, C. (2019, September 1). Misinformation has created a new world disorder. Scientific American. https://www.scientificamerican.com/article/misinformation-has-created-a-new-world-disorder/
  4. Han, B. C. (2017). Psychopolitics: Neoliberalism and new technologies of power. Verso Books.
  5. Zhuravskaya, E., Petrova, M., & Enikolopov, R. (2020). Political effects of the internet and social media. Annual Review of Economics, 12.
  6. Andrade, G. (2020). Medical conspiracy theories: Cognitive science and implications for ethics. Medicine, Health Care and Philosophy, 1-14.
  7. Fox, A. M., Feng, W., & Yumkham, R. (2017). State political ideology, policies, and health behaviors: The case of tobacco. Social Science & Medicine, 181, 139-147.
  8. Noah, L. (2011). Truth or consequences: Commercial free speech vs. public health promotion (at the FDA). Health Matrix: Journal of Law-Medicine, 21(1), 31-96.
  9. Healton, C. (2008). Keynote speech on the application of harm reduction to other public health problems: What is similar or different about the issue of tobacco. Journal of Health Care Law & Policy, 11(1), 93-102.
  10. Moynihan, D. P. (2020). Populism and the deep state: The attack on public service under Trump. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3607309
  11. Van Bavel, J. J., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., Cikara, M., … & Drury, J. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature Human Behaviour, 1-12.
  12. Horne, B. D., & Adali, S. (2017, May). This just in: Fake news packs a lot in title, uses simpler, repetitive content in text body, more similar to satire than real news. In Eleventh International AAAI Conference on Web and Social Media.
  13. Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019). Systematic literature review on the spread of health-related misinformation on social media. Social Science & Medicine, 240, 112552. https://doi.org/10.1016/j.socscimed.2019.112552
  14. Gill, R., van de Kuijt, J., Rosell, M., & Johannson, R. Disinformation in the cyber domain: Detection, impact, and counter-strategies. In 24th International Command and Control Research and Technology Symposium ‘Managing Cyber Risk to Mission’.
  15. Hounsel, A., Holland, J., Kaiser, B., Borgolte, K., Feamster, N., & Mayer, J. (2020). Supporting Early and Scalable Discovery of Disinformation Websites. arXiv preprint arXiv:2003.07684.
  16. McKee, M., & Middleton, J. (2019). Information wars: Tackling the threat from disinformation on vaccines. BMJ, 365, l2144.
  17. De Zen, A. (2020). A Misinformation Blueprint: Mapping warnings in an agile communication system [Doctoral dissertation, OCAD University]. OCAD University Open Research Repository. http://openresearch.ocadu.ca/id/eprint/3072/.
  18. Powell, A. (2020, June 26). Pandemic threatens to veer out of control in U.S. The Harvard Gazette. https://news.harvard.edu/gazette/story/2020/06/pandemic-threatens-to-veer-out-of-control-in-u-s/
  19. Mitchell, A., Jurkowitz, M., Baxter Oliphant, J., & Shearer, E. (2020, May 20). Americans who rely most on White House for COVID-19 news more likely to downplay the pandemic. Pew Research Center. https://www.journalism.org/2020/05/20/americans-who-rely-most-on-white-house-for-covid-19-news-more-likely-to-downplay-the-pandemic/

Acknowledgments
The author would like to thank Dr. Shannon Monnat and Megan Ray for helpful edits on an earlier version of this brief and Lerner Center staff for publication and dissemination efforts.

About the Author
Austin McNeill Brown is a PhD student in the Social Science Program and a Graduate Research Affiliate with the Lerner Center for Public Health Promotion in the Maxwell School of Citizenship and Public Affairs at Syracuse University.