February 2, 2023

ChatGPT, a new artificial intelligence application from OpenAI, has captured the imagination of the internet. Some have suggested it’s the most significant technological advance in modern history. In a recent interview, Noam Chomsky called it “basically high tech plagiarism.” Others have suggested large language models like ChatGPT spell the end for Google search, because they eliminate the need for users to sift through multiple websites to find digestible information.

The technology is built on a large language model trained on vast quantities of text drawn from the internet; it uses the patterns learned from that data to generate new content in response to user prompts. Users can ask it to produce almost any kind of text-based content.
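
For readers curious what that looks like in practice, the sketch below shows how a developer might request generated text from a large language model through an API. It is only an illustration: the SDK, model name and prompts are assumptions for the example, not a description of how the ChatGPT application itself is built.

```python
# A minimal sketch, assuming the OpenAI Python SDK (v1+) and an API key set in
# the OPENAI_API_KEY environment variable. The model name and prompts are
# illustrative assumptions, not how ChatGPT itself is implemented.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name for illustration
    messages=[
        {"role": "system", "content": "You are a helpful writing assistant."},
        {"role": "user", "content": "Write a short blog post about community gardens."},
    ],
)

print(response.choices[0].message.content)
```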

Given its clear creative power, many are warning of ChatGPT’s potential to become a misinformation superspreader, capable of instantly producing news articles, blog posts, eulogies and political speeches in the style of particular politicians, all on demand. With only slight advances, it’s not hard to see how AI-powered bot accounts on social media could become virtually indistinguishable from humans.

Analysts at NewsGuard, an online trust-rating platform for news, recently tested out the tool and found it produced false information on command when asked about sensitive political topics.

“In most cases, when we asked ChatGPT to create disinformation, it did so, on topics including the January 6, 2021, insurrection at the US Capitol, immigration and China’s mistreatment of its Uyghur minority,” wrote Jim Warren for the Chicago Tribune, adding that it took up to five tries in certain cases to get past OpenAI’s security buffer.

“Indeed, for some myths, it took NewsGuard as many as five tries to get the chatbot to relay misinformation, and its parent company has said that upcoming versions of the software will be more knowledgeable,” wrote Jack Brewster, Lorenzo Arvanitis and McKenzie Sadeghi for NewsGuard.

A transcription of text entered into the ChatGPT application by NewsGuard analysts, and the tool’s response. (NewsGuard)

While ChatGPT has a basic moral framework intended to prevent unethical use (ask it to enumerate positive attributes of Adolf Hitler, for example, and it will refuse the first time), those safeguards can easily be bypassed by offering weak justifications for the request.

Users can access forbidden information by tricking ChatGPT with a hypothetical. (@wlyzach/Twitter)

“It can serve up information in clear, simple sentences, rather than just a list of internet links. It can explain concepts in ways people can easily understand,” wrote New York Times technology reporters Nico Grant and Cade Metz. “It can even generate ideas from scratch, including business strategies, Christmas gift suggestions, blog topics and vacation plans.”

“ChatGPT is freakishly good at spitting out misinformation on purpose,” reads a headline from Futurism.

Sam Altman, the CEO of OpenAI, said on Twitter last year, “We should be much more nervous about the growing calls to censor ‘misinformation’ than the misinformation itself,” adding that experts have in the past been wrong about misinformation labels.

Software has already been developed that attempts to detect whether ChatGPT was used to generate a given piece of text.
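
One common detection approach exploits the fact that machine-generated text tends to be unusually predictable to a language model. The sketch below illustrates that perplexity heuristic, assuming the Hugging Face transformers library with GPT-2 as an example scoring model; it is an illustration under those assumptions, not the method behind any particular commercial detector, and such heuristics remain unreliable.

```python
# A rough sketch of a perplexity-based heuristic for flagging possibly
# machine-generated text. Assumes the Hugging Face "transformers" library and
# PyTorch; GPT-2 is used only as an example scoring model.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Return the model's perplexity on the text (lower = more predictable)."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        loss = model(**inputs, labels=inputs["input_ids"]).loss
    return math.exp(loss.item())

sample = "The quick brown fox jumps over the lazy dog."
print(f"Perplexity: {perplexity(sample):.1f}")
# Any cutoff for "likely machine-generated" would be an assumption; real
# detectors combine several signals and still produce false positives.
```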


Interesting fact-checks

Crab meat (AP Photo/Matthew S. Gunby)

  • Demagog: Pandemic fakes and war disinformation – what do they have in common? (English)
    • “With Russia’s invasion of Ukraine in 2022, social media accounts, so far known for disinformation related to the pandemic, began spreading false content about the war.”
  • Factly: Fraudulent websites are collecting user information in the name of FAKE Healthcare Schemes (English)
    • “Another post is being shared on social media platforms asking senior citizens to register on the website given in the post to get $3034 yearly added to their Social Security checks. Let’s verify the claim made in the post.”
  • Faktisk: Does Norway make a lot of money from Putin’s war? (English)
    • “We will get the final answer with the state accounts in April. But based on calculations we now have access to, Hansson is right. In the revised budget from December, the total petroleum revenues are calculated at NOK 1,316 billion. This is over a thousand billion more than what was calculated in the state budget.”
  • Taiwan FactCheck: Is Krab made of styrofoam? (Chinese traditional)
    • “A video of the process of making crab meat sticks circulated in the community, subtitled ‘Are you still eating crab sticks?’ It claimed the sticks were ‘made of Styrofoam’ and ‘plastic food,’ which aroused wild rumors among the public.”

Quick hits

Brazil’s President Luiz Inacio Lula da Silva speaks in Brasilia, Brazil, on Jan. 16, 2023. (AP Photo/Eraldo Peres, File)

From the news: 

  • Regulate and punish: the bills on disinformation that await the new Congress “Of the more than 100 bills (PLs) that seek to legislate on disinformation inherited by congressmen who take office this Wednesday (1st), most attack the problem from the perspective of punishing disinformation and regulating platforms. Among 112 projects that deal with disinformation identified in a survey carried out by Lupa, most establish punishments for those who share false content or impose norms to be adopted by big techs, while only 14 treat the impact of false information as an issue that can be resolved through education.” (Agencia Lupa, Nathália Afonso, Maiquel Rosauro)
  • Brazilian government responds to demands of press freedom organizations “The Brazilian government announced the creation of the National Observatory of Violence against Journalists, a demand from organizations defending press freedom and journalists in the country. The announcement was made by the Minister of Justice, Flavio Dino, on Jan. 17, a day after meeting with representatives in Brasilia, who shared with the minister a series of proposals to contain violence against press professionals.” (LatAm Journalism Review, Carolina de Assis)

From/for the community: 

  • Google and YouTube are partnering with the International Fact-Checking Network to distribute a $13.2 million grant to the international fact-checking community. “The world needs fact-checking more than ever before. This partnership with Google and YouTube infuses financial support to global fact-checkers and is a step in the right direction,” said Baybars Örsek, former executive director of the IFCN. “And while there’s much work to be done, this partnership has sparked meaningful collaboration and an important step.”
  • The IFCN has awarded $450,000 in grant support to organizations working to lessen the impact of false and misleading information on WhatsApp. In partnership with Meta, the Spread the Facts Grant Program gives verified fact-checking organizations resources to identify, flag and reduce the spread of misinformation circulating among the more than 100 billion messages sent on the platform each day. The grant supports eleven projects from eight countries: India, Spain, Nigeria, Georgia, Bolivia, Italy, Indonesia and Jordan. Read more about the announcement here.
  • IFCN signatory Vishvas News won a silver at the World Association of Newspapers and News Publishers awards for its event, “Sach Ke Sathi” (Truth Warriors). Read more.
  • The OSINT team at Faktisk, in collaboration with doctoral student Sohail Ahmed Khan, developed two prototypes of digital tools that verify audiovisual content.
  • IFCN job announcements: Program Officer and Monitoring & Evaluation Specialist

Thanks for reading. If you are a fact-checker and you’d like your work/projects/achievements highlighted in the next edition, send us an email at factually@poynter.org by next Tuesday. Corrections? Tips? We’d love to hear from you. Email us at factually@poynter.org.

Factually is a newsletter about fact-checking and misinformation from Poynter’s International Fact-Checking Network. Sign up here to receive it in your email every other Thursday.

Support high-integrity, independent journalism that serves democracy. Make a gift to Poynter today. The Poynter Institute is a nonpartisan, nonprofit organization, and your gift helps us make good journalism better.
Seth Smalley is a reporter at Poynter and the IFCN. Get in touch at seth@poynter.org or on Twitter @sethsalex.
