Public confidence in scientists is on the upswing in the United States. According to a survey released by the Pew Research Center in August, 60% of Americans say “scientists should play an active role in policy debates about scientific issues.” Confidence in scientists, the survey found, is now as high as confidence in the military.
At the same time, it is widely acknowledged that social media channels in the United States and elsewhere are flooded with falsehoods about all sorts of health topics. A simple search on Facebook, Instagram, YouTube or Twitter for coronavirus, cancer, diabetes, vaccines or weight loss, for example, would likely surface a batch of hoaxes and conspiracy theories. The same would happen on Google, Bing and other search engines.
This suggests that even though the majority of people may have a positive relationship with science, their attitudes toward specific areas of the field depend heavily on their worldview, culture, religion, friends and family. That can be pretty scary.
On Saturday, a group of experts met at the National Academies of Sciences, Engineering, and Medicine in Washington, D.C., for the 2020 MisinfoCon and discussed this topic for more than eight hours. What is the best strategy to overcome these barriers? What can be done?
At least six of the speakers invited to take the stage at NASEM gave clear suggestions on how media, fact-checkers, health organizations and governments could improve the work they do when communicating facts about health. The IFCN captured a few of them below:
1) Delivering more information doesn’t mean people will accept the facts
Kara Laney, a senior program officer at NASEM, suggested that instead of flooding audiences that reject science-based facts with tons of information, organizations of all kinds should remember that “not all audiences are the same.” Whoever is communicating a scientific fact should draw up different strategies for reaching different audiences. Moreover, individual beliefs should always be taken into consideration when planning how to communicate a fact someone is not willing to hear. Tailoring messages may be the most effective way to get people to hear a certain type of content.
2) Allow peeks behind the curtain and use sources they can relate to
Adam Cole, the science journalist who recently wrote and directed “The Mind, Explained,” a Netflix miniseries about memory, dreams, anxiety, meditation and psychedelic drugs, presented two ideas at MisinfoCon. First, he suggested that to gain trust, communicators should show the audience how they researched a topic and how they concluded something was right or wrong. Walking side by side, step by step, with the audience toward the answer to a question can be powerful. So show drafts, notes and doubts. Second, Cole said, it is important to use sources (people) with whom the audience can identify. In short, if the message should be heard by women, use women as sources.
3) Platforms should find a way to highlight trustworthy sources
Wen-Ying Sylvia Chou, a program director at the National Cancer Institute, part of the National Institutes of Health, presented the results of an eye-tracking study her team conducted on Facebook using hoaxes related to cancer. She emphasized that the platform must find a way to highlight trustworthy pages and sources in its News Feed. The eye-tracking map her staff produced after interviewing 52 people shows that Facebook users pay close attention to, and spend a lot of time analyzing, the source of a post. But it also indicated that it is hard to tell the National Cancer Institute, a real organization, apart from the National Federation of Cancer Hospitals, a false entity that adopted a fancy name just to gain online respect and fool people.
4) Correct others on social media — it works
Leticia Bode, an associate professor at Georgetown University, showed with data that corrections are effective, whether they come from the platforms (through their algorithms), from reputable organizations or from regular users correcting one another. According to Bode, corrections should happen as fast as possible and provide quality URLs with clear headlines. “Observational correction” is also a powerful tool that must be encouraged. “Observational correction happens when I am not getting corrected but somebody else is and I see it,” she said.
5) Use local influencers
Joe Smyser, chief executive officer of Public Good Projects, raised an interesting question during MisinfoCon: If only 2% of all U.S. kindergartners have vaccine exemptions, and most people in the country — and everywhere else — are in favor of vaccines, why aren’t they openly talking about and defending that position? Smyser suggested that the fear of being digitally harassed could be behind this silence and said strategic actions in small communities could be a smart way out. Communicators should first find local online influencers and ask them to defend science-based facts — people willing to post photos of themselves getting a flu shot, for example. When that happens, he said, others in the same community will feel more comfortable commenting online and defending vaccines. It’s a domino effect.
6) Health literacy should be taught at school / university
Kristy Roschke, the managing director of the News Co/Lab at Arizona State University’s Cronkite School of Journalism, showed the results of research titled “How students engage with the news.” She demonstrated that professors play a huge role. So if the news is being discussed in class, why not explore the importance of facts, too?
* Cristina Tardáguila is the associate director of the International Fact-Checking Network and the founder of Agência Lupa. She can be reached at firstname.lastname@example.org.