The Top 10 SciComm Research Studies of 2022
As the end of 2022 approaches, I bring you 10+ highly cited science communication research papers from the year!
I sourced these papers from Web of Science and Google Scholar, by searching for the terms “science communication,” “science misinformation,” and “risk communication.” (I also got banned from Google Scholar in the process by running too many searches for the same terms over and over!) I considered papers relevant to science communication published in 2022 with at least 5 citations.
I’ve summarized the top 10 highly cited papers below. But before you read on - check out this blog post’s featured visual! This is a collaboration between myself and the amazing science and medical illustrator Cassandra Tyson. We brainstormed ways to communicate something that all of the papers I’ve summarized below have in common.
What comes to mind when you look at this visual? Comment or tag me on social media to let me and Cassandra know! Buy Me a Coffee to hear more about the making of this visual!
1. Global COVID-19 Vaccine Acceptance: A Systematic Review of Associated Social and Behavioral Factors (29 citations)
In this study, biological engineers and computer scientists in Pakistan and the U.S. review when, where and why people accept a COVID-19 vaccine or hesitate to. They looked at more than 80 peer-reviewed reports of vaccine acceptance and hesitancy among different populations worldwide.
People in Indonesia and Brazil were most likely to accept a COVID-19 vaccine. Vaccine acceptance rates were lowest in various areas in Africa. The U.S. rates hover in the middle; around 66% of the population is fully vaccinated.
When and why do people refuse to get a COVID-19 vaccine? Choudhary Shakeel and colleagues find that trust issues, conspiracy theories and fears about the risk of vaccines are common reasons for vaccine hesitancy. People who refuse vaccines are also less educated and use social media more often than people who get vaccinated.
Misinformation about the COVID-19 vaccine abounds on social media. There are rumors that the vaccine alters one’s genes, that it causes infertility, that it contains a tracking device, and that it is unsafe. Believing these rumors can make a person hesitant to receive a vaccine.
Communication failures certainly bolster the roots of the misinformation “weed.” In countries with lower vaccine acceptance rates, the researchers found mixed messaging about vaccine effectiveness and safety and a lack of quality public education about vaccines.
What can science communicators learn from this? Well, if rumors underlie vaccine hesitancy, we should focus on communication strategies that foster trust in science and healthcare. We could also foster critical and analytical thinking to help people process what they see and hear online more critically.
2. Misinformation: susceptibility, spread, and interventions to immunize the public (20+ citations)
In this paper, social psychologist Sander van der Linden equates misinformation to a virus. It spreads person to person. Some people are also more susceptible to its “symptoms.” And the best treatment for “viral” misinformation is prevention; wiping it out once it has spread widely is massively difficult.
“It is … helpful to think of misinformation as a viral pathogen that can infect its host, spreading rapidly from one individual to another within a given network, without the need for physical contact. One benefit of this epidemiological approach lies in the fact that early detection systems could be designed to identify, for example, superspreaders, which would allow for the timely deployment of interventions…”
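Van der Linden’s epidemiological framing suggests concrete tooling: if misinformation spreads through a network, an early-detection system could flag the accounts that fan content out to the most others. Here is a minimal toy sketch of that idea (the sharing network, account names, and degree threshold are all invented for illustration, not taken from the paper):

```python
# Toy sketch: flag potential "superspreaders" in a sharing network
# by counting how many shares each account sends out.
# The (sharer, recipient) pairs below are made up for illustration.

from collections import defaultdict

shares = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dan"),
    ("bob", "carol"), ("erin", "alice"), ("alice", "erin"),
]

out_degree = defaultdict(int)
for sharer, _recipient in shares:
    out_degree[sharer] += 1

# Accounts that share to many others are candidate superspreaders.
superspreaders = [user for user, deg in out_degree.items() if deg >= 3]
print(superspreaders)  # → ['alice']
```

A real system would of course weigh content, timing and reach rather than raw share counts, but even this crude out-degree filter captures the intuition behind “timely deployment of interventions” at the most connected nodes.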
People without the skills, time or attention needed to consider new information carefully can fall for misinformation. And they may find fact-checks uncompelling.
Misinformation spreads easily on social media, especially in “crowds” of loud, like-minded people. Of course, “exposure does not equal persuasion (or ‘infection’),” van der Linden writes. But that’s cheap consolation when most of us will share potentially interesting headlines, even if we suspect they aren’t true.
How can we prevent the harms of misinformation? Van der Linden assures us that debunking does often work - but debunking messages rarely reach all the people who heard the original misinformation. Debunking messages can also fail for folks with strong or long-held beliefs and emotions tied up in the misinformation.
“Just as vaccines trigger the production of antibodies to help confer immunity against future infection, the same can be achieved with information. By pre-emptively forewarning and exposing people to severely weakened doses of misinformation (coupled with strong refutations), people can cultivate cognitive resistance against future misinformation.”
So what do we do? We can warn people against specific pieces of information. We can also help people recognize rumors more generally. For example, information that seems to be trying to scare us may be misinformation. We can also encourage people to hone their own myth-busting skills - by playing video games that teach media literacy, for instance!
3. An Efficient Feedback Control Mechanism for Positive/Negative Information Spread in Online Social Networks (16 citations)
We can think of rumors, conspiracy theories and misinformation as “negative information.” In this analogy, “positive information” is the truth or facts that counter that negative information.
How can we slow or stop the spread of negative information on social media platforms like Facebook and Twitter? How can we do this without limiting the spread of positive information? Engineers in China and the UK tackled this challenge in a study published this year.
The researchers used a lot of differential equations that I’ll admit I don’t quite understand. But essentially, they modeled how people change their minds when they see or hear positive or negative information from others they trust.
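To give a flavor of what such a model looks like, here is a minimal sketch in the spirit of opinion-spread models (this is not the authors’ actual system of equations; the compartments and rate constants are invented for illustration). Undecided people can be converted by either negative information or positive debunking, and debunking also erodes negative belief:

```python
# Toy opinion-spread model (not the paper's actual equations).
# u, n, p = fractions of undecided, negative-info believers,
# and positive-info believers; rates beta/gamma are invented.

def simulate(steps=1000, dt=0.01, beta=0.6, gamma=0.4):
    u, n, p = 0.8, 0.1, 0.1
    for _ in range(steps):  # simple Euler integration
        du = -beta * u * n - gamma * u * p   # undecided get persuaded
        dn = beta * u * n - 0.2 * n * p      # debunking erodes negative belief
        dp = gamma * u * p + 0.2 * n * p
        u, n, p = u + du * dt, n + dn * dt, p + dp * dt
    return u, n, p

u, n, p = simulate()
print(round(u + n + p, 6))  # the three fractions still sum to ~1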
The researchers suggest we can limit the spread of negative information by doing three things simultaneously and as early during the spread of misinformation as possible:
Warn people before they receive the rumor or misinformation. Help them to recognize negative information when they see it and to understand why and how it is false and harmful.
Refute the rumor or counter the misinformation with persuasive messages for people who already believe this negative information.
Give positive information in a compelling format to people who’ve heard the rumors or misinformation but are uncertain about what to believe.
How could you apply these three strategies in your science communication work?
“The experimental results demonstrate that our [strategies] can effectively decrease the spread of negative information.”
4. Are COVID-19 conspiracies a threat to public health? (10 citations)
Many science and health officials have wrung their hands over the surge of conspiracy theories about the SARS-CoV-2 virus. People have doubted the virus’ origins, its officially reported health impacts and potential treatments, and even its existence. But just how worried should we be about COVID-19 conspiracy theories?
In this paper, psychology researchers at the University of Essex and the University of London questioned whether COVID-19 conspiracy theories actually change believers’ health behaviors. Marie Juanchich, Miroslav Sirota, Daniel Jolles and Lilith Whiley put this idea to the test. In three separate studies conducted at different time points during the COVID-19 pandemic, the researchers investigated whether believing in a COVID-19 conspiracy theory changes someone’s likelihood of following official health guidelines. Each study involved a survey of around 300 to 400 people.
“The link between conspiracy theories and health protective behaviours is … not always negative and [it’s] difficult to explain.”
We humans aren’t likely to protect ourselves from a threat that we don’t believe is real. But not all COVID-19 conspiracy theory believers doubt that the virus is a threat. Rather, believers question information and advice provided by scientists, politicians and other authorities whom they tend to distrust.
For example, many COVID-19 conspiracy theory believers champion alternative and untested therapies while discrediting or villainizing official vaccines and testing procedures. People who believe conspiracy theories tend to distrust government, healthcare, science and political officials. If someone believes one conspiracy theory, they also likely believe another.
Juanchich, Sirota, Jolles and Whiley surveyed people who believe conspiracy theories but also still believe the virus is real. The findings may surprise you. COVID-19 conspiracy theory believers follow basic health guidelines just as strictly as anyone else does. They wash their hands, wear a mask and practice social distancing as often as anyone else.
However, conspiracy theory believers are less likely to agree to get tested for or vaccinated against COVID-19. Conspiracy theory believers tend to be paranoid and see a lot of risk in things outside their personal control. The conspiracy theory believers surveyed in this study were less likely than non-believers to use a contact-tracing app and were more likely to share any type of information about COVID-19, including misinformation.
“Conspiracy theory believers were reluctant to undertake actions for which they had lower levels of personal control, and they felt these actions were riskier and less beneficial. This highlights that the involvement of others could be perceived as a threat and echoes results linking conspiracy theory beliefs with interpersonal distrust … and paranoid tendencies…”
What stands out to me (and the authors of this study) is the importance of building relationships and trust between conspiracy theory believers and experts in science and health. We could also better choose the right messengers and consider how to help people feel more in control of their healthcare.
5. Strategies That Promote Equity in COVID-19 Vaccine Uptake for Black Communities: a Review (10 citations)
When this review was written by collaborators across the U.S. and Canada, Black people were less likely to have gotten a COVID-19 vaccine than their White counterparts. As of July 2022, according to Kaiser Family Foundation data, this disparity has improved but still exists. In this paper, Debbie Dada and colleagues review strategies for encouraging vaccination in Black communities.
“Black and Hispanic people have been less likely than their White counterparts to have received a vaccine over the course of the vaccination rollout, but these disparities have narrowed over time and reversed for Hispanic people.” - Kaiser Family Foundation
The review covers several strategies for addressing mistrust, countering misinformation and improving access to COVID-19 vaccines. Here are some takeaways:
Address racism in the healthcare system. Establish trustworthy actors, processes, channels and locations to get COVID-19 vaccines and vaccine information to Black communities.
Create culturally-relevant content about COVID-19 vaccination.
Listen and acknowledge uncertainties.
Work with community leaders to deliver information and interventions.
Provide vaccination options in “medical deserts” via pop-up or mobile clinics, for example.
“Prior to providing information about the vaccine, public health messages and healthcare providers should acknowledge past and contemporary injustices and racism as justifiable reasons for mistrust. Messengers ought to appreciate the significance of the origin of mistrust and encourage vaccination by using autonomy-supportive, empathetic, fact-based, non-confrontational, non-coercive, and non-judgmental communication. [...] Not trusting the 'system' is a healthy response to structural racism. It’s the structural inequalities and structural racism that’s the problem, not the people.” - Dada et al.
6. Perceived Information Overload and Unverified Information Sharing on WeChat Amid the COVID-19 Pandemic (9 citations)
Misinformation was a popular research topic this year. In 2020, the World Health Organization called for efforts to flatten the “infodemic” curve, claiming that misinformation creates confusion, mistrust and risky health behaviors. Scientists and communicators have jumped to answer the call. Google searches about misinformation have increased (according to Google Trends). There’s been a flurry of related research studies.
In this study, media and communication scholars in China asked: What makes people share unverified information about COVID-19 on social media? Most people wouldn’t intentionally share false information with others. So why do so many people share information they haven’t checked for accuracy?
Perhaps people are overwhelmed with the online information ocean and don’t have the time or capacity to fact-check everything. Perhaps their fears and anxieties convince them to share uncertain information, just in case. Or perhaps people are willing to share a piece of misinformation because everyone else around them is doing it.
The researchers surveyed over 500 people in China to learn more about their information-sharing habits on WeChat. Their findings are a jab to my heart as someone with generalized anxiety.
When people report being overwhelmed by a flood of information about COVID-19, they feel anxious. This anxiety then makes people more likely to share unverified information on social media - especially if others around them are sharing it, too.
In other words, while people mostly hesitate to share uncertain information about important topics, hesitation flies out the window when people are anxious. And it doesn’t help that they see lots of other people sharing uncertain information on social media.
“People who are anxious are more likely to be careless in decision-making and share unverified information (on social media).”
In sum, people share unverified information to cope with information overload and to help manage their anxiety.
“To manage anxiety and reduce emotional distress, people share information with their family, friends, co-workers, and community members to feel connected to close others (Chen et al., 2021; Lim et al., 2021).”
I wonder how science communicators could help alleviate people’s sense of being overwhelmed by information during uncertain times. Sharing more actionable information designed to help people take some personal control over risks and threats might help. We need to find ways to help people regulate their emotions online in the face of concerning and emotional content. Some solutions might include practical digital literacy training and emotional support.
7. The Backfire Effect After Correcting Misinformation Is Strongly Associated With Reliability (8 citations)
Have you ever heard that correcting a myth might solidify it in people’s minds? The principle reminds me of the common phrase, “don’t think of a pink elephant.” What are you thinking about now? Exactly - a pink elephant.
“The backfire effect is when a correction increases belief in the very misconception it is attempting to correct, and it is often used as a reason not to correct misinformation.” - Swire-Thompson et al., 2022
Some past studies have found evidence for a backfire effect in myth-busting while other studies have not. Misinformation “corrections” may backfire if they attack people’s strongly-held beliefs. Or people may have heard a myth or piece of misinformation so often (often hearing it again in the correction) that with time, they forget the correction but remember the misinformation. Believing that something is true because it sounds familiar is called the illusory truth effect.
In this study, media and psychology researchers looked closer at this hotly debated “backfire” effect.
The findings call the backfire effect into question. The researchers tested people’s beliefs in misinformation before and after issuing corrections. There was no evidence that people believed the misinformation more strongly after a correction. The researchers suggest that much evidence for backfire effects might come from measurement error or people’s uncertainty about their beliefs.
In other words, it looks like science communicators shouldn’t stop correcting misinformation. But not all corrections are created equal. Just because corrections don’t backfire doesn’t mean they work. As we’ve seen in other studies published this year, there are many reasons why a correction could fail to convince people not to believe a myth.
Here’s just a short list of reasons that a correction or “myth bust” might not work:
The “messenger” of the correction is not trusted by the audience
The correction is poorly written or is complex
The correction doesn’t account for people’s emotions or personal experiences
The correction doesn’t reach the people who need it
The correction is “aggressive” or makes people feel personally attacked or embarrassed
The correction isn’t memorable
How do you think science communicators could create more effective corrections to misinformation?
8. Predictors of Pro-environmental Beliefs, Behaviors, and Policy Support among Climate Change Skeptics (6 citations)
Living in Louisiana has led me to meet many people with paradoxical views about the environment. Many people in the “Sportsman's Paradise” are skeptical about climate change as a human-caused issue, yet spend lots of time in nature and care deeply about local ecosystems. Can we leverage their interests in nature-based recreation to help solve climate change anyway?
“Our results reveal statistically significant and consistent positive effects of (negative) environmental experiences on climate change skeptics’ environmental concerns, behaviors, and policy support.”
In this study, researchers in Idaho looked at whether and how climate skeptics’ concerns for their environment (e.g. wanting cleaner air and water) could help them support climate solutions. The researchers surveyed around 1,000 climate skeptics in the U.S.
Climate skeptics often hold pro-environmental concerns. The researchers found that many skeptics are concerned about plastic in the ocean, stronger forest fires and air pollution. Skeptics are likely to support climate policies framed to resonate with their concerns. They are especially likely to support pro-environmental policies if they’ve experienced negative environmental events themselves. Skeptics are likely to support renewable energy tech, regulation of air pollution and regulation of water pollution.
The researchers did find that religious ideation and belief in conspiracy theories are associated with less environmental concern and less support for pro-environmental policies.
How could climate communicators help to build consensus and enact climate policy? They could start by finding shared concerns and values with climate skeptics. Leaning on shared experiences with natural disasters like fires and storms, and on general pro-environmental concerns, might help.
Some other takeaways:
Stop trying to convince skeptics of climate science. It usually doesn’t work.
Prompt people to reflect on their own environmental experiences.
Communicate about the environment in ways that elicit emotions like nostalgia and hope.
Offer opportunities for direct, hands-on experiences (or virtual reality experiences!) with the environment. Emphasize empathy toward the environment and others.
Listen and engage people in dialogue about solutions to environmental concerns they care about.
9. Psychological underpinnings of pandemic denial - patterns of disagreement with scientific experts in the German public during the COVID-19 pandemic (5 citations)
In this study, researchers in Germany looked at how scientific experts and non-experts differed in their attitudes about COVID-19 early in the pandemic. They surveyed 1,575 laypersons and 128 scientific experts.
A third of people surveyed had dismissive or doubtful attitudes about the origins, spread and severity of COVID-19. Tobias Rothmund and colleagues attributed these attitudes to a rise of “science denial.” (Note: Many science communicators consider this term problematic, in part because it lumps together people who doubt science for different reasons.)
Survey participants answered questions about their health beliefs and behaviors, belief in COVID-19 conspiracy theories, and knowledge about COVID-19. Based on their responses, Rothmund and colleagues classified people into four groups: Alarmed, Concerned, Doubtful, and Dismissive. Their methodology is modeled after that of Leiserowitz’s “Global Warming’s Six Americas” study.
People in the Doubtful group were likelier to believe conspiracy theories, were less open-minded, trusted scientists less than others, and were more likely to get their COVID-19 information from social media. They preferred simple versus complex explanations for why things happen (although all humans share that tendency to some extent). People in the Dismissive group had similar views but were less likely to believe they were personally at risk of infection, and less likely to take action.
It’s not clear how this categorization is helpful to science communicators. Theoretically, if you could reach each segment separately, you could consider what messaging strategies might work for each. But can we do this in practice? An alternative is to focus more on communicating about emerging science and health issues in ways that resonate with more people. That might include prioritizing stories about the trustworthiness of scientists and simple story-driven corrections of misinformation over “facts.”
10. Visualizing viruses (5 citations)
This last study is my favorite! In it, researchers in the UK provide recommendations on how we can visualize viruses like SARS-CoV-2 to gain and share insights about how these viruses work.
“[C]onstructing integrative illustrations of virus particles can challenge our thinking about the biology of viruses, [and provide] tools for science communication.”
Viruses are far too small for us to see with our human eyes, or even with a light microscope. We literally have to shoot tiny electrons at them to create pictures of them. But seeing a virus in our mind’s eye is helpful. It can help us understand the virus and can make its risks to us and its presence around us more palpable.
“[M]ethods from medical visualisation can tackle the challenge of creating a concrete visual representation of an invisible virus, even when a complete experimental description of the virus particle is lacking.”
But the question “What does a virus particle look like?” doesn’t always have a clear answer. Viruses change over time and we may not know what new viruses actually look like. Some viruses can take on different forms.
In this study, virologists and scientific illustrators collaborated to build visual models of influenza A and SARS-CoV-2. They based their visualizations on known structures of related viruses, electron microscope images, studies of the viruses’ surface proteins and other scientific data. They decided to visualize the different forms that the particles of these viruses can take instead of showing a single “idealized” shape. And they used interesting and intricate shape and color details while weighing an audience’s need for simplicity and recognizable context. They even made the visualizations explorable via a 3-D web viewer and in virtual reality.
“Visualisations of virus particles can stimulate creative projects in response to the pandemic. Discussion of the models on social media led to their use as the basis for a set of 3D printed virus particles. [...] We hope that these models will continue to provoke other people to reimagine and respond to the invisible viruses that have such an impact on all of us.”
More papers:
Lessons Learned From Dear Pandemic, a Social Media-Based Science Communication Project Targeting the COVID-19 Infodemic (4 citations)
An anchor in troubled times: Trust in science before and within the COVID-19 pandemic (4+ citations)
Linking Online Vaccine Information Seeking to Vaccination Intention in the Context of the COVID-19 Pandemic (4 citations)
Testicular cancer and YouTube: What do you expect from a social media platform? (9 citations). TLDR: Testicular cancer information on YouTube isn’t reliable, although patients may find useful information about treatment options and people’s experiences with them on the platform.
Towards the reflective science communication practitioner (3 citations)
Researchers at the University of Southern California created a dataset of antivaccine content and vaccine misinformation on social media. They made this data set public to help others better understand the role of online misinformation in vaccine hesitancy. Check it out! Muric, G., Wu, Y., & Ferrara, E. (2021). JMIR Public Health and Surveillance, 7(11), e30642.