March 15, 2021

One year ago, Tom Hanks and his wife, Rita Wilson, tested positive for the novel coronavirus. The NBA suspended its season. About 1,000 Americans were infected, and the World Health Organization declared a pandemic.

Today, more than 500,000 Americans are dead from COVID-19. The country is recovering from the worst economic crisis since the Great Depression. Schools and colleges, many shuttered since last spring, are slowly reopening. Playing in the background: a symphony of falsehoods downplaying and denying the severity of the pandemic.

Since the novel coronavirus first emerged in late 2019, PolitiFact has fact-checked nearly 800 claims — 60% of which we rated False or Pants on Fire! In the early days of the pandemic, much of the misinformation focused on ways to cure or prevent COVID-19. Now, disinformation is casting doubt on the efficacy and safety of coronavirus vaccines.

Here are a few things we’ve observed and learned while fact-checking coronavirus claims over the past year — and some lessons for how to avoid misinformation in what we hope is the twilight of the pandemic.

1. Speculation thrives in the unknown

In spring 2020, there was a lot that scientists didn’t know about the coronavirus, including where it came from, how it spread or how to treat it.

Those uncertainties left a gap in public knowledge about the virus and the disease it caused, COVID-19. Misinformation filled that gap.

Facebook users speculated that sunlight could kill the coronavirus (False). Others said drinking water or — better yet — hot lemon water could prevent infection (Both False). Vegan Instagram accounts said there’d be no pandemic if people didn’t eat animals (Mostly False). Health misinformation websites spread conspiracy theories that the virus was created in a lab (False).

Scientists know more about the novel coronavirus in 2021 than they did last year, but there is still no known cure for COVID-19. So remain skeptical of any claims on social media or elsewhere that seem to lack context about the virus. Remember: Don’t share something on social media unless you know it’s true.

2. Denying reality has broad appeal

From the earliest days of the pandemic, there was a concerted effort by politicians and commentators to deny or downplay the severity of COVID-19.

Pundits like Rush Limbaugh and David Harris Jr. likened the coronavirus to the common cold (False) and the flu (Pants on Fire!). Politicians like Sen. Marco Rubio, R-Fla., and Florida Gov. Ron DeSantis selectively cited data to make the virus seem less deadly than it actually was. On social media, some denied the existence of COVID-19 altogether or posted photos of empty hospital beds to claim the crisis was fake.

Of course, the crisis was real — weekly hospitalizations and daily deaths shot up between March and April. But claims denying that reality continued to proliferate, particularly after states imposed coronavirus restrictions and face mask mandates. The denialism continued well into the fall, when former President Donald Trump said the country was “rounding the turn” on COVID-19 (False). Within weeks, cases and deaths surged again.

“When people are fearful, they seek information to reduce uncertainty,” said Jeff Hancock, a communication professor at Stanford University, in a March 2020 blog post. “This can lead people to believe information that may be wrong or deceptive because it helps make them feel better, or allows them to place blame about what’s happening.”

3. Doctors can be good disinformers

A health worker wearing a protective suit is disinfected in a portable tent outside the Gat Andres Bonifacio Memorial Medical Center in Manila, Philippines, on April 27, 2020, during an enhanced community quarantine to prevent the spread of the coronavirus. (AP Photo/Aaron Favila)

In May, millions of people saw a video that claimed face masks don’t work, that the coronavirus was manipulated and that the drug hydroxychloroquine was an effective way to treat it. The messenger: a scientist named Judy Mikovits.

The impact of her video, “Plandemic: The Hidden Agenda Behind COVID-19,” showed that misinformation about COVID-19 is particularly effective when the source appears to have an air of authority. Mikovits’ claims were wrong, and her work on chronic fatigue syndrome and vaccines has been widely discredited, but the 26-minute “Plandemic” video still reached millions of people on Facebook, Instagram, Twitter and YouTube.

Its slick editing and marketing set the bar for other videos that contained COVID-19 misinformation. In July, Breitbart published a clip that showed doctors in white lab coats outside the U.S. Supreme Court. They discouraged mask-wearing and falsely said hydroxychloroquine, a drug used to treat rheumatoid arthritis and lupus, could cure the coronavirus.

Some of the most effective coronavirus misinformation comes from people who appear to know what they’re talking about. Just because someone wears a white lab coat and has a medical degree does not mean their claims about COVID-19 are accurate. Get your facts from public health agencies with good track records like the Centers for Disease Control and Prevention to avoid being duped.

4. Conspiracy theories overlap

What do anti-vaccine advocates, QAnon supporters and lockdown protesters have in common? In 2020, they all teamed up to spread conspiracy theories about COVID-19.

Microsoft co-founder Bill Gates was a common target, falsely accused of planting microchips in people via vaccines and patenting the coronavirus to profit from its treatment. Supporters of QAnon, a baseless conspiracy theory about child sex trafficking, promoted hydroxychloroquine as a COVID-19 cure and false claims about vaccines. False claims doubting the efficacy of face masks spread widely in alternative health and pro-Trump communities.

Experts say the coordination between different conspiracy groups, such as left-leaning anti-vaccine communities and right-leaning QAnon supporters, in pushing these false claims was surprising. But given how social media platforms work, it shouldn’t have been.

“These networks are wired to spread disinformation about the virus,” said Kate Starbird, an associate professor at the University of Washington, in a December interview. “They’re highly motivated for political reasons to select claims or pieces of evidence to support their narrative — and to seek to downplay its effects or claim that masks or social distancing don’t work for whatever reason.”

5. Falsehoods can flow from the top down

Protesters demanding Florida businesses and government reopen, march in downtown Orlando, Fla., Friday, April 17, 2020. Small-government groups, supporters of President Donald Trump, anti-vaccine advocates, gun rights backers and supporters of right-wing causes have united behind a deep suspicion of efforts to shut down daily life to slow the spread of the coronavirus. (AP Photo/John Raoux)

When you think of online misinformation, you may think of social media rumors, conspiracy theorists or foreign disinformation campaigns. But in 2020, a lot of coronavirus misinformation came from the White House.

In February 2020, Trump told the country to view the coronavirus “the same as the flu,” a line echoed by pundits and Facebook users throughout the year. He promoted hydroxychloroquine as a potential coronavirus treatment, setting the stage for others to claim the drug was a miracle cure. In the weeks leading up to Election Day, Trump falsely claimed the U.S. was padding COVID-19 death numbers.

Sometimes, the president would repeat misinformation he saw online, such as a bogus claim that only 6% of COVID-19 deaths were actually caused by the virus. But Trump himself was the source of much of the coronavirus misinformation we saw over the past year, illustrating how those with the biggest platforms have the biggest potential to misinform.

“Bottom-up misinformation seems inevitable in any circumstance like this,” said Brendan Nyhan, a government professor at Dartmouth College, in a December interview, adding: “But it wasn’t inevitable that our elites would downplay the severity of this threat for as long as they have.”

6. Misinformation has consequences

False COVID-19 claims are more than just a distraction from the public health crisis — they can directly affect people’s lives and health.

A Florida taxi driver and his wife got sick after believing online conspiracy theories about face masks and 5G data networks; the wife later died from complications related to COVID-19. A lecturer at Columbia University had trouble filling her prescription for a lupus drug after Trump and others touted hydroxychloroquine as a coronavirus cure. Nurses and doctors on the front lines of the pandemic took to social media to dispel rumors, some of which they heard from patients who were sick with COVID-19.

The effect of misinformation can also be seen in polling data from 2020. An NPR/Ipsos poll from December showed that 40% of Americans falsely believed the coronavirus was created in a lab in China. In July, a quarter of Americans thought there was some truth to the conspiracy theory that the pandemic was planned. Throughout the year, about 30% of Americans thought the threat “has been exaggerated” and the virus was “purposely created and released.”

It’s not all bad news, though.

Joseph Uscinski, a conspiracy theory expert at the University of Miami, conducted the polling on Americans’ belief in an exaggerated or fabricated virus. He said the percentage of people who held those beliefs stayed relatively stable last year, meaning that the spread of conspiracy theories doesn’t necessarily cause an increase in believers. And as the coronavirus vaccine rollout continues, a February Gallup poll found that 71% of Americans are willing to receive the vaccine — an increase from 50% in the fall.

Katie Sanders contributed to this report.

This article was originally published by PolitiFact, which is part of the Poynter Institute. It is republished here with permission. See the sources for these fact-checks here and more of their fact-checks here.

Daniel Funke is a staff writer covering online misinformation for PolitiFact. He previously reported for Poynter as a fact-checking reporter.
