Tech CEOs are back in the hot seat
As the CEOs of Facebook, Twitter and Google prepare for their quarterly congressional grilling, the three tech companies and various advocacy groups have been fighting a public relations battle in the press.
On March 18, human rights group Avaaz published a report faulting Facebook for enabling groups that promoted falsehoods about the 2020 U.S. presidential election. The group argued that if the company had acted more aggressively to curtail the reach of those falsehoods and the groups propagating them, it could have prevented the Jan. 6 attack on the U.S. Capitol.
On Monday, Facebook vice president for integrity Guy Rosen published a rebuttal in a Morning Consult op-ed. As evidence of Facebook’s earnest efforts to address online falsehoods, he cited the company’s removal of more than 1.3 billion fake accounts in late 2020 and its 2018 changes to the news feed, which cost the company an estimated 5% of its site traffic.
Speaking to BBC Radio’s World Questions program, Twitter’s director of public policy strategy Nick Pickles explained the difficulty social media companies face trying to act as global referees for online speech.
“Removing content in some cases makes people’s views harden, and in some cases restricts the debate; it restricts the ability to inform and challenge people who have ideas that are not accurate,” Pickles said. He added that the COVID-19 pandemic clarified the potential harms of certain types of misinformation, but it didn’t make social media companies’ jobs any easier.
Speaking on the same program, Bellingcat founder Eliot Higgins said too much focus on social media companies misses the broader picture of online misinformation.
“People have lost their faith in traditional sources of authority,” Higgins said. “They go online and they look for alternative sources of authority.” He argued for a more nuanced approach that considers why people seek out alternative sources of information, rather than blanket bans that scatter users toward harder-to-reach corners of the internet.
- theJournal.ie: “Covid-19 claims shared on a LED display driven around a number of locations in Ireland are false” (In English)
- Irish purveyors of COVID-19 and vaccine falsehoods are getting creative. TheJournal.ie discovered a bevy of false claims displayed on an LED sign towed around Ireland. Reporter Lauren Boland systematically debunked the sign’s claims about vaccines, masks and COVID-19 testing.
- Istinomer: “It’s not true that those vaccinated against COVID-19 can’t give blood” (In Serbian)
- A viral post spread by a prominent Serbian purveyor of vaccine disinformation falsely claimed that getting vaccinated against COVID-19 disqualifies you from donating blood. Istinomer cited and plainly explained the guidance from Serbia’s Institute for Blood Transfusion, which states that people who receive certain vaccines may be asked to wait before donating blood, but won’t be banned.
From the news:
- “Russia outsources disinformation efforts to foreign troll farms,” from Roll Call. An unclassified report from the Office of the Director of National Intelligence confirmed that Russia used companies and individuals in Ghana, Mexico and Nigeria to seed disinformation into the American information ecosystem.
- “How Syria’s war of disinformation reshaped the world,” from the Independent. International correspondent Borzou Daragahi argues the Syrian government’s use of disinformation to deter foreign involvement in its civil war influenced tactics later used by Russia and the Jan. 6 rioters.
- “‘Ya Basta Facebook’ Campaign Presses Company To Curb Spanish-Language Misinformation,” from Wyoming Public Media. Advocacy group “the Real Facebook Oversight Board” is calling on the tech giant to do more to address the apparent lack of moderation of Spanish-language content in the United States. The group cites a study from the human rights group Avaaz that found 70% of Spanish-language COVID-19 misinformation was neither labeled nor removed, compared with 30% of similar content in English.
From/for the community:
- “A framework for information incidents,” from Full Fact. After seven months of consultation with fellow fact-checkers, tech companies, academics and government representatives, Full Fact has published a global framework for how each entity can respond to future infodemics. The plan uses a five-tier system to determine the severity of the infodemic and suggests appropriate responses. Full Fact is seeking feedback from the public about the proposed system until May 14.
- Turkish fact-checking organization Teyit saw a huge positive social media response to its explainer articles debunking misconceptions about the Istanbul Convention — an international human rights treaty that aims to combat domestic violence and violence against women. Turkish President Recep Tayyip Erdoğan withdrew Turkey from the treaty last week following domestic criticism that it was an attack on Turkish family values.
- “Aspen Institute Commission on Information Disorder Announces Full Member List and Planning Roadmap,” from the Aspen Institute. An 18-member commission chaired by Katie Couric, Chris Krebs and Rashad Robinson will look into policy solutions to address harmful disinformation plaguing the American information ecosystem.
If you are a fact-checker and you’d like your work/projects/achievements highlighted in the next edition, send us an email at firstname.lastname@example.org by next Tuesday.
Any corrections? Tips? We’d love to hear from you: email@example.com.
Thanks for reading Factually.