January 26, 2012

The protests that erupted in Nigeria earlier this month over skyrocketing fuel prices initially received little coverage from North American media, including CNN.

But CNN’s iReport team soon began seeing a steady stream of photos, videos and on-the-ground reports submitted by users in Nigeria. It became clear the network couldn’t ignore the story.

“CNN wasn’t really covering that story at all until we started seeing an outpouring of contributions of video and photos and people writing into iReport over and over for days,” said Lila King, participation director for CNN Digital. “It made us say, ‘Gosh, you know we really need to be paying attention to this.’ ”

The result was increased coverage on CNN properties featuring the material submitted by a range of Nigerian citizens and freelancers.

This is iReport’s fifth anniversary, and a CNN spokesperson calls it “the most developed and active citizen journalism platform of any news organization worldwide.” It claims 1,002,428 registered iReporters and 2.4 million unique users each month. King said iReport has had content submitted from every country on earth.

While some iReporters are freelance journalists, or aspiring journalists, most are unconnected to the press. All are unpaid. They see something, shoot a video or photo, and upload it to the site. Others eschew breaking news contributions and instead upload videos of themselves offering commentary about politics, world affairs or local issues.

The steady stream of content submitted from people the world over is a boon to CNN, but represents a major verification challenge. None of the photos, videos or other types of submitted content are used on other CNN properties until they are verified. So how do you verify citizen content submitted from people across the globe?

I spoke with King recently to have her outline the iReport vetting process. She said iReport’s team of eight full-time producers vets roughly 8 percent of all submitted reports each year. That may seem low, but it adds up when you consider that roughly 500 iReports are submitted each day, on average.

Though the iReport site doesn’t require all material to be vetted, King said producers moderate every submission to remove, for example, offensive or copyright-infringing material.

“We don’t prescreen anything before it’s posted [to the iReport website], but we do apply a level of moderation to every single piece of content,” King said. “Let’s say I upload a video, it will [be] published directly to the site as long as I register and give some contact information.”

The process CNN calls vetting kicks in when a submission could be used in whole or in part by a CNN property other than iReport.

“What we call vetting we do only to things [we] want to highlight inside CNN proper,” King said. “There is a moderator who looks at the piece and will see if it meets the brand standards for what CNN can host on its site.”

Once a piece of content is deemed vetted, an iReport logo is added to it, and a producer adds a note to the story page to offer some background. Unvetted content has “NOT VETTED BY CNN” added to it in order to alert readers.

iReport used to pop up a message to new visitors explaining the difference between vetted and unvetted content.

King said that was removed in a recent redesign.

“The popup was a bit bothersome to some users and a topic the iReport community mentioned frequently,” she said. “We didn’t lose the content of the popup, though – you can still see the ‘not vetted’ messages clearly displayed across iReports that haven’t been vetted.”

Phase One: Contact contributor

Vetting begins with an iReport producer reaching out to the person who uploaded the content in order to ask questions by phone, email, Skype or whatever option is available. It’s about “verifying facts and making sure the story is real and the person is who they say they are, the thing happened when they say it did in the place they say it did,” according to King.

When the report being vetted includes photos, video or news from someone on the ground, as opposed to, say, an opinion piece, the producer will work to get a sense of everything the person saw, heard and felt, rather than just checking what was submitted.

For example, Tina Armstrong, a freelance journalist based in Nigeria, was first contacted for vetting in December 2011 after she submitted an iReport about attending the UN Summit on Climate Change.

“At first I thought it was a scam and had to Google to confirm,” she said, replying to a message I sent her within the iReport system. “When I found out it was for real, I answered the questions that were sent to me [about the] report I uploaded. Among the questions [they] asked were, if I was actually at the venue of the event, the type of camera I was using, what I think about the event and if anybody had paid me to do the report.”

It’s common for an iReport producer to ask questions about more than just the material that was submitted. In this sense, an iReporter is a hybrid reporter/source. The material they send is treated as reporting, but their personal experience and knowledge makes them an important source of additional information and context.

“Really, a piece of video in many cases is just wallpaper until you can talk to a human being who is part of the story and watched it happen,” King said. “So much of what we do is about getting in touch and forming a relationship and getting a sense of what they went through.”

Phase Two: Verify information

Once a producer gathers information from the iReporter, the second phase of vetting kicks in.

“What we need to do once we’ve collected information is go and verify details, so we do lots of different things,” she said. “We will call the international desk at CNN and find the person who is an expert in the area wherever the story took place, and figure out what are they hearing, what do they know … We will check local media and will call local people in [the] field. We will call our affiliates and try to get to a place where we can confirm as many details as possible.”

In terms of challenges, King said it’s sometimes difficult to get in touch with the person who submitted a report. This was particularly challenging during the Arab Spring, when people would submit things and not want to follow up with iReport for fear of suffering consequences.

“They don’t want, in a lot of cases, their names or identities associated with the photos they were taking,” she said.

Handling suspicious content

In the early days of iReport there was a concern that people would use the platform to submit fake reports and spread misinformation. King said they’re still on the lookout for suspicious content, but the site has so far avoided being fooled during the vetting process.

“I’m happy and pleased to say at this point — more than five years in — we’ve never had something that we vetted and put on air that we had to take back. We are batting 1.000,” she said. “Maybe it’s because in some ways I think we’re maybe a little more conservative than we need to be, but I think the conservatism works in our favor.”

That said, its unvetted content has caused problems. In 2008, for example, a user submitted a false story that Apple CEO Steve Jobs had suffered a heart attack. The report caused Apple stock to fall, and attracted unwelcome scrutiny of iReport’s policy of allowing anyone to post a report. Apple issued a denial to help tamp down rumors, and CNN issued a statement.

With that in mind, I asked King what makes the iReport team suspicious of a submission. Many times, it’s because a piece of content is too polished, she said.

“It’s a story that’s actually written very much like a [traditional] news story,” King said. “It’s like 800 words with very short paragraphs and a dateline at the top with a standard newspaper lead. That, for us, tends to be a flag that someone has just copy and pasted from some other place.”

Though King said iReport has never placed its “vetted by CNN” seal on a fake or inaccurate piece of content, she did recall one example when their vetting process caught a problem.

“In the early days of iReport there was a big story with wildfires threatening Southern California and people being evacuated from homes,” King said. “We had no current images and a photo came in to iReport that was picture perfect. It was exactly the homepage photo you would want. It was taken inside someone’s apartment and you could see fire coming over the mountain through the window.”

A producer called the woman who submitted the photo, and her boyfriend answered the phone.

“He said, ‘Oh dude, I’m so glad you called because she’s been working on this for hours,’ ” King said.

It turned out the woman had taken a photo and been painstakingly touching it up in Photoshop to make it spectacular.

“We got her to send the original photo and it just wasn’t a very good photo,” King said. “She was not trying to run some scam — she was just trying to make an awesome photo.”
